Browsing by Author "Hamed El-Bastawissy, Ali"
Now showing 1 - 5 of 5
Item Benchmarking the Higher Education Institutions in Egypt using Composite Index Model (CiteSeer, 2014)
Rashad M El-Hefnawy, Mohamed; Hamed El-Bastawissy, Ali; Ahmed Kadry, Mona
Egypt has the largest and most significant higher education system in the Middle East and North Africa, but it has been continuously facing serious and accumulated challenges. The Higher Education Institutions in Egypt are undergoing important changes involving the development of performance; they are implementing strategies to enhance the overall performance of their universities using ICT, but the gap between the existing state and the state required for self-regulation and improvement processes is still not entirely clear. Using strategic comparative analysis models and tools to evaluate the current and future states will affect the overall performance of universities and shape new paradigms in the development of the Higher Education System (HES). Several studies have investigated the evaluation of universities through the development and use of ranking and benchmark systems.

Item Data Quality Based Data Integration Approach (World of Computer Science and Information Technology Journal (WCSIT), 2015)
Samir Abdel-Moneim, Mohamed; Hamed El-Bastawissy, Ali; Hamed Kholief, Mohamed
Data integration systems (DIS) are systems where query answers are collected from a set of heterogeneous and autonomous data sources. Data integration systems can improve results by detecting the quality of the data sources and retrieving answers from the significant ones only. The quality measures of the data in the data sources not only help in determining the significant data sources for a given query, but also help data integration systems produce results in a reasonable amount of time and with fewer errors. In this paper, we perform an experiment that shows a mechanism used to calculate and store a set of quality measures on data sources.
The quality measures are then interactively used in selecting the most significant candidate data sources to answer users' queries. The justification and evaluations are done using the Amalgam and THALIA benchmarks. We show that our approach dramatically improves query answers.

Item DRTX: A Duplicate Resolution Tool for XML Repositories (International Journal of Computer Science and Network Security, 2012)
Mohamed Abd El-ghfar, Randa; Hamed El-Bastawissy, Ali
Detecting duplicates in XML is not trivial due to structural diversity and object dependency. This paper suggests a duplicate detection and resolution tool (DRTX), an efficient XML duplicate detector and resolver that applies two well-known duplicate detection techniques, normal edit distance (NED) and the token-based Damerau-Levenshtein distance algorithm (TBED), then compares the results and suggests the better similarity measure for each case. DRTX is not only a duplicate detection and resolution system; it also provides two extra services: first, an XML file merger used to merge XML documents, thus solving the structural heterogeneity problem; second, a dirty XML generator used to insert known duplicate problems into a clean XML file so that the mentioned algorithms can be applied to that file, exploring how accurately the system detects these problems. To minimize the number of pair-wise element duplicate comparisons, a set of filters is used to increase the efficiency of DRTX while its effectiveness remains adjustable. Experimental results show that neither algorithm is better than the other; each has its own use, i.e. NED is better at lower similarity thresholds while TBED is better at higher ones.

Item Quality Driven Approach for Data Integration Systems (7th Int. Conf. Inf. Technol., 2015)
Samir Abdel-Moneim, Mohamed; Hamed El-Bastawissy, Ali; Hamed Kholief, Mohamed
By data integration systems (DIS) we mean systems in which query answers are instantaneously mapped from a set of available data sources. The query answers may be improved by detecting the quality of the data sources and mapping answers from the significant ones only. The quality measures of the data in the data sources may help in determining the significant data sources for a given query. In this paper, we suggest a method to calculate and store a set of quality measures on data sources. The quality measures are then interactively used in selecting the most significant candidate data sources to answer user queries. User queries may include the user's preferences regarding quality issues. A quality-based approach becomes increasingly important when there is a large number of data sources or when the user requires data with specific quality preferences.

Item Transactions Management in Cloud Computing (Egyptian Computer Science Journal, 2014)
Ali Abd-El Azim, Nesrine; Hamed El-Bastawissy, Ali
Cloud computing has emerged as a successful paradigm for web application deployment. Economies of scale, elasticity, and pay-per-use pricing are the biggest promises of the cloud. Database management systems serving these web applications form a critical component of the cloud environment. In order to serve thousands of varied applications and their huge amounts of data, these database management systems must not only scale out to clusters of commodity servers, but also be self-managing, fault-tolerant, and highly available. In this paper we survey and analyze the currently applied transaction management techniques, and we propose a paradigm according to which transaction management can be depicted and handled.
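The DRTX abstract above contrasts a character-level edit distance (NED) with a token-based Damerau-Levenshtein measure (TBED), noting that the character-level measure suits lower similarity thresholds while the token-based one suits higher thresholds. The sketch below illustrates that contrast with generic textbook implementations; the function names, normalization to [0, 1], and token-matching scheme are illustrative assumptions, not the paper's actual algorithms.

```python
def osa_distance(a: str, b: str) -> int:
    """Damerau-Levenshtein distance (optimal string alignment variant):
    insert, delete, substitute, and adjacent-transposition edits."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i
    for j in range(len(b) + 1):
        d[0][j] = j
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
            # Adjacent transposition, e.g. "ca" <-> "ac" counts as one edit.
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)
    return d[-1][-1]

def char_similarity(a: str, b: str) -> float:
    """Character-level similarity in [0, 1] (stand-in for NED)."""
    if not a and not b:
        return 1.0
    return 1.0 - osa_distance(a, b) / max(len(a), len(b))

def token_similarity(a: str, b: str) -> float:
    """Token-based similarity (stand-in for TBED): for each token of the
    first string, take its best match among the second string's tokens."""
    ta, tb = a.lower().split(), b.lower().split()
    if not ta or not tb:
        return 0.0
    best = [max(char_similarity(x, y) for y in tb) for x in ta]
    return sum(best) / len(best)
```

For example, `"john smith"` vs. `"smith john"` scores poorly at the character level but 1.0 at the token level, since the tokens match exactly despite the reordering; this is the kind of case where a token-based measure clears a high similarity threshold that a character-level measure cannot.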