International Journal Papers
Permanent URI for this collection: http://dl.cerist.dz/handle/CERIST/17
Item: A cooperative framework for automated segmentation of tumors in brain MRI images (Springer, 2023-03)
Hadjadj, Zineb
Brain tumor segmentation from 2D Magnetic Resonance Images (MRI) is an important task for several applications in the field of medical analysis. Commonly, this task is performed manually by medical professionals, but it is not always straightforward due to similarities between tumors and normal tissue and to variations in tumor appearance. The automation of medical image segmentation therefore remains a real challenge that has attracted the attention of many researchers in recent years. Instead of choosing between region and contour approaches, this article proposes a region-edge cooperative method for brain tumor segmentation from MRI images. The region approach relies on support vector machines (SVMs), a popular and well-established classification method that distinguishes between normal and abnormal pixels based on intensity and texture features. To control and guide the region segmentation, we take advantage of the geodesic Active Contour Model (ACM) of Ron Kimmel, which delimits object boundaries accurately. The two methods cooperate sequentially to obtain a flexible and effective system for brain tumor segmentation. Experimental studies are performed on synthetic and real 2D MRI images of various modalities from the radiology unit of the university hospital center in Bab El Oued, Algeria. These MRI images cover various tumor shapes, locations, sizes, and intensities. The proposed cooperative framework outperformed SVM-based segmentation and ACM-based segmentation when executed independently.

Item: Efficient Machine Learning-based Approach for Brain Tumor Detection Using the CAD System (Taylor & Francis, 2023-04)
Guerroudji, Mohamed Amine; Hadjadj, Zineb; Lichouri, Mohamed; Amara, Kahina; Zenati, Nadia
Medical research has focused on improving diagnosis through medical imaging in recent decades. Computer Assisted Diagnosis (CAD) systems have been developed to help doctors identify suspicious areas of interest, particularly those with cancer-like characteristics. CAD systems employ various algorithms and techniques to extract important numerical measurements from medical images that clinicians can use to evaluate patient conditions. This study proposes a statistical classification-based approach to efficient brain cancer detection. The proposed approach operates in three stages: first, Gradient Vector Flow (GVF) snake models and mathematical morphology techniques retrieve regions of interest. The second stage characterizes these regions using morphological and textural parameters. Finally, a Bayesian network uses this description as input to separate malignant from benign cancer classes. We also compared the performance of the Bayesian network with other popular classification algorithms, including SVM, MLP, KNN, Random Forest, Decision Tree, XGBoost, LGBM, Gaussian Process, and RBF SVM. The results showed the superiority of the Bayesian network for the task of brain tumor classification. The proposed approach has been experimentally validated, reaching a sensitivity of 100% and a classification accuracy of over 98% for tumors, which demonstrates the efficiency of the cancer cell segmentation.
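A minimal sketch of the region-edge cooperation described in the first item above (Hadjadj, 2023): an SVM classifies pixels from intensity and texture features, and a geodesic active contour, initialised from the SVM mask, refines the tumor boundary. The feature choices, parameters, and the scikit-learn/scikit-image calls are illustrative assumptions rather than the paper's exact pipeline; names such as mri_slice and train_slice are hypothetical.

```python
import numpy as np
from scipy import ndimage as ndi
from sklearn.svm import SVC
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def pixel_features(img):
    """Per-pixel intensity plus a simple local-variance texture cue."""
    img = np.asarray(img, dtype=float)
    local_mean = ndi.uniform_filter(img, size=5)
    local_var = ndi.uniform_filter(img ** 2, size=5) - local_mean ** 2
    return np.stack([img.ravel(), local_var.ravel()], axis=1)

def cooperative_segmentation(img, svm, n_iter=100):
    # Stage 1 (region): the SVM labels every pixel as tumor (1) or normal (0).
    mask = svm.predict(pixel_features(img)).reshape(np.asarray(img).shape).astype(np.int8)
    # Stage 2 (edge): a geodesic active contour initialised from the SVM mask
    # locks onto nearby image boundaries and refines the delineation.
    gimg = inverse_gaussian_gradient(np.asarray(img, dtype=float))
    return morphological_geodesic_active_contour(gimg, n_iter, init_level_set=mask)

# svm = SVC(kernel="rbf").fit(pixel_features(train_slice)[labelled_idx], labels)
# tumor_mask = cooperative_segmentation(mri_slice, svm)
```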
Item: Genetic-Based Algorithm for Task Scheduling in Fog–Cloud Environment (Springer)
Khiat, Abdelhamid; Haddadi, Mohamed; Bahnes, Nacera
Over the past few years, there has been a consistent increase in the number of Internet of Things (IoT) devices utilizing Cloud services. However, this growth has brought about new challenges, particularly in terms of latency. To tackle this issue, fog computing has emerged as a promising trend. By incorporating additional resources at the edge of the Cloud architecture, the fog–cloud architecture aims to reduce latency by bringing processing closer to end-users. This trend has significant implications for enhancing the overall performance and user experience of IoT systems. One major challenge in achieving this is minimizing latency without increasing total energy consumption. To address this challenge, it is crucial to employ a powerful scheduling solution. Unfortunately, this scheduling problem is generally known to be NP-hard, meaning that no method able to find an optimal solution in reasonable time has been discovered to date. In this paper, we focus on the problem of task scheduling in a fog–cloud based environment and propose a novel genetic-based algorithm called GAMMR that aims to achieve an optimal balance between total consumed energy and total response time. We evaluate the proposed algorithm using simulations on 8 datasets of varying sizes. The results demonstrate that the proposed GAMMR algorithm outperforms the standard genetic algorithm in all tested cases, with an average improvement of 3.4% in the normalized objective function.
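A minimal sketch of a genetic algorithm for fog–cloud task scheduling in the spirit of the GAMMR item above: chromosomes assign tasks to fog or cloud nodes, and the fitness is a weighted, normalized combination of total response time and total energy. The cost model, normalization constants, weights, and operators are illustrative assumptions, not the paper's formulation.

```python
import random

TASKS = [5, 3, 8, 2, 7, 4]            # task lengths (arbitrary units)
NODES = [                             # (speed, power) per fog/cloud node
    {"speed": 2.0, "power": 1.0},     # fog: slower, cheaper
    {"speed": 2.5, "power": 1.2},
    {"speed": 6.0, "power": 4.0},     # cloud: faster, costlier
]

def cost(assignment):
    finish = [0.0] * len(NODES)
    energy = 0.0
    for task, node in zip(TASKS, assignment):
        t = task / NODES[node]["speed"]
        finish[node] += t
        energy += t * NODES[node]["power"]
    return max(finish), energy        # response time (makespan), total energy

def fitness(assignment, w=0.5, t_max=15.0, e_max=60.0):
    t, e = cost(assignment)           # crude normalizers t_max, e_max (assumed)
    return w * (t / t_max) + (1 - w) * (e / e_max)   # lower is better

def evolve(pop_size=30, generations=100, p_mut=0.1):
    pop = [[random.randrange(len(NODES)) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                   # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(TASKS))
            child = a[:cut] + b[cut:]                    # one-point crossover
            if random.random() < p_mut:                  # random-reset mutation
                child[random.randrange(len(TASKS))] = random.randrange(len(NODES))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# best = evolve(); print(best, cost(best))
```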
Item: Networked Wireless Sensors, Active RFID, and Handheld Devices for Modern Car Park Management: WSN, RFID, and Mob Devs for Car Park Management (IGI Global, 2015-07-01)
Djenouri, Djamel; Karbab, Elmouatezbillah; Boulkaboul, Sahar; Bagula, Antoine
Networked wireless sensors, actuators, RFID, and mobile computing technologies are explored in this paper in the quest for modern car park management systems offering sophisticated services over the emerging Internet of Things (IoT), where things such as ubiquitous handheld computers, smart ubiquitous sensors, and RFID readers and tags are expected to be interconnected to virtually form networks that enable a variety of services. After an overview of the literature, the authors propose a scalable and low-cost car parking framework (CPF) based on the integration of the aforementioned technologies. A preliminary prototype implementation has been carried out, as well as experimentation with some modules of the proposed CPF. The results provide a proof of concept and, in particular, reveal that the proposed approach to WSN deployment considerably reduces cost and energy consumption compared to existing solutions.

Item: Privacy-preserving remote deep-learning-based inference under constrained client-side environment (Springer, 2023)
Boulemtafes, Amine; Derhab, Abdelouahid; Ait Ali Braham, Nassim; Challal, Yacine
The remote deep learning paradigm raises important privacy concerns related to clients' sensitive data and to deep learning models. However, dealing with such concerns may come at the expense of more client-side overhead, which does not fit applications relying on constrained environments. In this paper, we propose a privacy-preserving solution for deep-learning-based inference which ensures effectiveness and privacy while meeting the efficiency requirements of constrained client-side environments. The solution adopts the non-colluding two-server architecture, which prevents accuracy loss because it avoids approximating activation functions, and copes with constrained client sides thanks to its low overhead cost. The solution also ensures privacy by leveraging two reversible perturbation techniques in combination with the Paillier homomorphic encryption scheme. Compared to the conventional homomorphic encryption approach, the client-side overhead evaluation shows an improvement of up to more than two thousand times in execution time and up to more than thirty times in transmitted data size.
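A minimal sketch of the non-colluding two-server idea used in the item above (Boulemtafes et al., 2023): the client hides its input behind a reversible additive perturbation, one server evaluates a linear layer on the perturbed input, the other on the perturbation alone, and the client recombines the two results exactly, with no approximation of activations. This toy covers a single linear layer in plain NumPy; the paper's full protocol, including the Paillier-encrypted exchanges, is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))          # model weights, known to both servers
b = rng.normal(size=4)

def client_perturb(x):
    r = rng.normal(size=x.shape)     # random reversible perturbation
    return x + r, r                  # x + r goes to server A, r goes to server B

def server_a(x_masked):              # sees only the masked input
    return W @ x_masked + b

def server_b(r):                     # sees only the random mask
    return W @ r

x = rng.normal(size=8)               # client's sensitive input
x_masked, r = client_perturb(x)
y = server_a(x_masked) - server_b(r) # client recombines: W(x+r)+b - Wr = Wx + b
assert np.allclose(y, W @ x + b)     # exact result, neither server ever saw x
```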
Item: Privacy-preserving deep learning for pervasive health monitoring: a study of environment requirements and existing solutions adequacy (Elsevier, 2022-03)
Boulemtafes, Amine; Derhab, Abdelouahid; Challal, Yacine
In recent years, deep learning in healthcare applications has attracted considerable attention from the research community. Such applications are deployed on powerful cloud infrastructures to process big health data. However, privacy issues arise when sensitive data are offloaded to the remote cloud. In this paper, we focus on pervasive health monitoring applications that allow anywhere and anytime monitoring of patients, such as heart disease diagnosis, sleep apnea detection, and, more recently, early detection of Covid-19. As pervasive health monitoring applications generally operate in a constrained client-side environment, it is important to take these constraints into consideration when designing privacy-preserving solutions. This paper therefore reviews the adequacy of existing privacy-preserving solutions for deep learning in pervasive health monitoring environments. To this end, we identify the privacy-preserving learning scenarios and their corresponding tasks and requirements. Furthermore, we define evaluation criteria for the reviewed solutions, discuss them, and highlight open issues for future research.

Item: Deep learning in pervasive health monitoring, design goals, applications, and architectures: An overview and a brief synthesis (Elsevier, 2021-11)
Boulemtafes, Amine; Khemissa, Hamza; Derki, Mohamed Saddek; Amira, Abdelouahab; Djedjig, Nabil
The continuous growth of aging populations in some countries, together with the rising number of patients with chronic conditions, calls for efficient healthcare solutions. Pervasive Health Monitoring (PHM) is an important pervasive computing application that has the potential to provide patients with high-quality medical service and to enable quick-response alerting of critical conditions. To that end, PHM enables continuous and ubiquitous monitoring of patients' health and wellbeing using Internet of Things (IoT) technologies, such as wearables and ambient sensors. In recent years, deep learning (DL) has attracted growing interest from the research community as a means to improve PHM applications. In this paper, we discuss the state of the art of DL-based PHM by identifying (1) the main PHM applications where DL is successful, (2) the design goals and objectives of using DL in PHM, and (3) design notes, including DL architectures and data preprocessing. Finally, the main advantages, limitations, and challenges of adopting DL in PHM are discussed.

Item: PRIviLY: Private Remote Inference over fulLY connected deep networks for pervasive health monitoring with constrained client-side (Elsevier, 2023-09)
Boulemtafes, Amine; Derhab, Abdelouahid; Challal, Yacine
The remote deep learning paradigm makes it possible to better leverage the power of deep neural networks in pervasive health monitoring (PHM) applications, especially by addressing the constrained client-side environment. However, remote deep learning in the context of PHM must ensure three properties: (1) meet the high accuracy requirement of the healthcare domain, (2) satisfy the client-side constraints, and (3) cope with the privacy requirements related to the high sensitivity of health data. Different privacy-preserving solutions for remote deep learning exist in the literature, but many of them fail to fully address the PHM requirements, especially with respect to constrained client-side environments. To that end, we propose PRIviLY, a novel privacy-preserving remote inference solution designed specifically for the popular Fully Connected Deep Networks (FCDNs). PRIviLY avoids the use of encryption for privacy preservation of sensitive information, in order to fully prevent accuracy loss and to alleviate the server-side hardware requirements. Instead, PRIviLY adopts a non-colluding two-server architecture and leverages the linear computations of FCDNs along with reversible random perturbation and permutation techniques, preserving the privacy of sensitive information while meeting the low overhead requirement of constrained client sides. At the cloud server, the efficiency evaluation shows that PRIviLY achieves an improvement ratio of 4 to more than 15 times for communication and a minimum improvement ratio of 135 times for computation overhead. At the intermediate server, the improvement ratio for computation is at least 10,900, while for communication it varies from 5 to more than 21 times. As for the client side, PRIviLY incurs an additional overhead of about 27% in terms of communication and between 16% and 27% in terms of computation.

Item: A novel descriptor (LGBQ) based on Gabor filters (Springer, 2023-12-23)
Aliradi, Rachid; Ouamane, Abdelmalik
Recently, many automatic face verification methods have focused on learning optimal distance measurements between faces, in particular by learning facial features through similarity, which can make the resulting descriptors too weak. To fill this gap, we propose a new descriptor called Local Binary Gabor Quantization (LGBQ) for 3D/2D face verification, based on Gabor filters and a tensor subspace transformation. Our main idea is to binarize the responses of eight Gabor filters corresponding to eight orientations into a binary code, which is then converted into a decimal number, combining the advantages of three methods: Gabor, LBP, and LPQ. The resulting descriptor is more robust to shape variations in face parts caused by expression, pose, lighting, and scale. To this end, we merge two techniques: multilinear whitened principal component analysis (MWPCA) and tensor exponential discriminant analysis (TEDA). Experiments use two publicly available databases, namely the Bosphorus and CASIA 3D face databases. The results show the superiority of our method in terms of accuracy and execution time compared to state-of-the-art methods.
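A minimal sketch of the LGBQ coding step in the item above: at each pixel, the signs of the responses of eight Gabor filters (one per orientation) are packed into an 8-bit code and read as a decimal value in [0, 255]. The filter frequency and the use of the real part of the response are assumptions; the paper's patch histogramming and MWPCA/TEDA tensor projections are not shown, and face_patch is a hypothetical input.

```python
import numpy as np
from skimage.filters import gabor

def lgbq_codes(image, frequency=0.25):
    """Per-pixel LGBQ-style code: one bit per Gabor orientation, packed to a decimal."""
    image = np.asarray(image, dtype=float)
    codes = np.zeros(image.shape, dtype=np.int32)
    for k in range(8):                                   # eight orientations
        theta = k * np.pi / 8
        real, _ = gabor(image, frequency=frequency, theta=theta)
        codes += (real >= 0).astype(np.int32) << k       # k-th bit of the code
    return codes.astype(np.uint8)                        # decimal code in [0, 255]

# hist = np.bincount(lgbq_codes(face_patch).ravel(), minlength=256)  # patch descriptor
```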
Item: DIEDA: discriminative information based on exponential discriminant analysis combined with local features representation for face and kinship verification (Springer, 2018-01-30)
Aliradi, Rachid; Belkhir, Abdelkader; Ouamane, Abdelmalik; Elmaghraby, Adel S.
Face and kinship verification using facial images is a novel and challenging problem in computer vision. In this paper, we propose a new system that uses discriminative information based on exponential discriminant analysis (DIEDA) combined with multiple-scale descriptors. The histograms of different patches are concatenated to form a high-dimensional feature vector, which represents a specific descriptor at a given scale. The histograms of each zone are projected to reduce the feature vector dimensionality and compared using the cosine similarity metric. Lastly, zone scores corresponding to the various descriptors at different scales are fused and verified using a classifier. This paper exploits discriminative side information for face and kinship verification in the wild (deciding whether image pairs are from the same person, or share a kin relation, or not). To tackle this problem, we take face samples with unlabeled kin relations from the Labeled Faces in the Wild dataset as the reference set. We build an optimized function by minimizing the distance between intraclass samples (with a kin relation) and maximizing the distance between neighboring interclass samples (without a kin relation) using the DIEDA approach. Experimental results on three publicly available face and kinship datasets show the superior performance of the proposed system over other state-of-the-art techniques.
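A minimal sketch of the exponential discriminant analysis (EDA) at the heart of the DIEDA item above: the between- and within-class scatter matrices are replaced by their matrix exponentials before solving the discriminant eigenproblem, which avoids the small-sample singularity of the within-class scatter, and projected descriptors are then compared with cosine similarity. Descriptor extraction, multi-scale fusion, and the final classifier are omitted; function and variable names are illustrative, and descriptors are assumed to be scaled (e.g., normalized histograms) so the scatter entries stay moderate.

```python
import numpy as np
from scipy.linalg import expm, eigh

def eda_projection(X, y, n_components=2):
    """X: (n_samples, n_features) descriptor vectors, y: class labels."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))                       # between-class scatter
    Sw = np.zeros((d, d))                       # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
        Sw += (Xc - mc).T @ (Xc - mc)
    # Generalized eigenproblem on the matrix exponentials: expm(Sw)^-1 expm(Sb).
    evals, evecs = eigh(expm(Sb), expm(Sw))
    return evecs[:, ::-1][:, :n_components]     # leading discriminant directions

def cosine_score(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# W = eda_projection(train_descriptors, train_labels)
# score = cosine_score(desc_a @ W, desc_b @ W)  # high score: same person / kin pair
```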