WSEAS CONFERENCES. WSEAS, Unifying the Science



 Volume 9, 2010
Print ISSN: 1109-2750
E-ISSN: 2224-2872








Issue 1, Volume 9, January 2010

Title of the Paper: A New Semantic Similarity Measuring Method Based on Web Search Engines


Authors: Gang Lu, Peng Huang, Lijun He, Changyong Cu, Xiaobo Li

Abstract: Word semantic similarity measurement is a basic research area in natural language processing, intelligent retrieval, document clustering, document classification, automatic question answering, word sense disambiguation, machine translation, and related fields. To address the shortcomings of current approaches to word semantic similarity measurement, such as low term coverage and difficulty of updating, a novel method based on web search engines is proposed, which computes similarity from the information in retrieved results, including page counts and snippets. Thanks to the huge volume of information on the Web, the proposed method resolves the issues mentioned above. The experimental results demonstrate the effectiveness of the proposed method.

Keywords: Web Search Engine, Semantic Similarity, Intelligent Retrieval, Word semantics, WordNet
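The general idea of similarity from page counts can be sketched as a co-occurrence measure over hit counts. The function below is a minimal illustration of one such measure (a Jaccard-style coefficient); the counts, the threshold, and the function name are illustrative assumptions, not the authors' exact formulas.

```python
def web_jaccard(count_p, count_q, count_pq, threshold=10):
    """Jaccard-style similarity from search-engine page counts.

    count_p, count_q: hits for each word queried alone.
    count_pq: hits for the conjunctive query "p AND q".
    Very small co-occurrence counts are treated as noise (zero similarity).
    """
    if count_pq < threshold:
        return 0.0
    return count_pq / (count_p + count_q - count_pq)

# Illustrative counts (not real search-engine data)
sim = web_jaccard(count_p=50_000, count_q=40_000, count_pq=20_000)
```

Snippet-based features would complement such count-based scores, since snippets expose the lexical contexts in which the two words co-occur.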

Title of the Paper: A Dijkstra's Mobile Web Application Engine for Generating Integrated Light Rail Transit Route


Authors: A. M. Haziq Lim, N. S. Wan Sazli, B. Hussin, A. A. Azlianor, S. M. Suhaizan, K. Massila

Abstract: In the capital city of Malaysia, the Integrated Light Rail Transit (LRT) system is one of the most important modes of public transportation, as it connects key districts where historical sites, places of interest, business areas and shopping malls are concentrated. The train services run independently but have interchanges that connect the different LRT lines. These interchanges can cause trouble for travelers who choose an incorrect destination station, especially across different LRT lines, leading to wasted time and higher cost. In previous research we implemented a mobile web application using a rule-based algorithm in which destination-oriented routes are generated dynamically by determining the nearest station to a given place. Our current research focuses on Dijkstra's algorithm to provide a more effective and intelligent shortest-path system that helps travelers reach their desired destinations. In this paper, we discuss the results from our Dijkstra's-algorithm application prototype.

Keywords: Mobile application, Shortest path, Dijkstra, Dynamic route map
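Dijkstra's algorithm on a station graph can be sketched as follows. The station names and travel times below are hypothetical placeholders, not data from the authors' system.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from source to every reachable station.

    graph: dict mapping station -> list of (neighbour, minutes) edges.
    Returns a dict of minimal minutes; unreachable stations are absent.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical LRT fragment with an interchange station
lrt = {
    "KL Sentral": [("Masjid Jamek", 6)],
    "Masjid Jamek": [("KL Sentral", 6), ("Ampang", 15), ("Gombak", 20)],
    "Ampang": [("Masjid Jamek", 15)],
    "Gombak": [("Masjid Jamek", 20)],
}
times = dijkstra(lrt, "KL Sentral")
```

Interchanges fit naturally into this model as ordinary edges between stations on different lines, optionally weighted with a transfer penalty.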

Title of the Paper: Usability User Testing of Selected Web-based GIS Applications


Authors: Jitka Komarkova, Martin Jedlicka, Miloslav Hub

Abstract: Web-based geographic information systems (WGIS) are aimed at end-users who have only a very limited knowledge of GIS, if any. For this reason, WGIS applications have to be user-friendly; in terms of software quality, they have to be usable. Many usability evaluation methods have been developed. Some of them involve real users or their representatives, while others rely only on expert evaluation of the applications. The advantage of involving real users, or at least their representatives, is clear: they can identify usability problems that experts would miss. The difficulty is that usability evaluation by real users in a real situation is hard to arrange. Experimental usability testing with representatives of users is the alternative: an artificial environment is prepared in a laboratory, and the whole experiment must be carefully conducted and controlled. In all cases the usability testing method must be carefully designed and/or adapted to the given situation and conditions. The aim of the paper is to propose a suitable experimental method based on user usability testing to identify the most serious usability problems of 14 comparable WGIS applications (all of them run by the Czech regional authorities and targeted at citizens and other end-users). The proposed method is qualitative, so its main aim is to identify the most serious usability problems of the evaluated applications. The method, the testing results and the identified problems are described at the end of the paper.

Keywords: Usability, User Testing, Web-based GIS

Title of the Paper: Coupling Metrics for Business Process Modeling


Authors: Wiem Khlif, Nahla Zaaboub, Hanene Ben-Abdallah

Abstract: Modeling business processes is vital when improving or automating existing business processes, documenting processes properly, or comparing business processes. In addition, it is necessary to evaluate the quality of a business process model through a set of quality metrics. One specific category of such metrics is coupling, which measures the functional and informational dependencies between the tasks/processes in a business process model. Our contribution in this paper consists in adapting object-oriented software coupling metrics to business process models. This adaptation is based on correspondences we establish between concepts of the Business Process Modeling Notation and object-oriented concepts. The adapted coupling metrics offer more information about the dependencies among processes and their tasks in terms of data and control. They can be used, for instance, to evaluate how errors occurring in a particular task/process propagate to others. Finally, we validate the proposed metrics theoretically.

Keywords: Coupling quality metrics, coupling measurement, quality metrics, business process models

Title of the Paper: Measuring the Efficiency of Cloud Computing for E-learning Systems


Authors: Paul Pocatilu, Felician Alecu, Marius Vetrici

Abstract: With the rapid growth of cloud computing, more and more industries are shifting their focus from investing in processing power to renting it from specialized vendors. The education field is no different. E-learning systems usually require many hardware and software resources, and numerous educational institutions cannot afford such investments; cloud computing is the best solution for them. The implementation of a cloud computing e-learning system has its peculiarities and needs a specific approach. This paper measures the positive impact of using cloud computing architectures on e-learning solution development. We advance a set of cloud computing efficiency metrics for enhanced control of the e-learning implementation process. The long-term overall efficiency of cloud computing usage in the field of e-learning systems is also evaluated.

Keywords: Cloud computing, E-learning, Mobile learning, Project management, Pareto Principle

Title of the Paper: Structures used in Secure Automatic Ticketing System


Authors: Marius Popa, Cristian Toma

Abstract: The paper presents a solution for an automatic ticketing system and discusses security assurance for this kind of system. Concepts and terms used in the development of a secure automatic ticketing system are presented. An architecture of a secure automatic ticketing system is depicted, together with its components and their roles. Technical details of the cards used in the implementation of the system are also highlighted.

Keywords: Ticketing System, Distributed Informatics System, Informatics Security, Smart Card

Title of the Paper: Identity based Threshold Cryptography and Blind Signatures for Electronic Voting


Authors: Gina Gallegos-Garcia, Roberto Gomez-Cardenas, Gonzalo I. Duchen-Sanchez

Abstract: Recently, there has been increasing interest in improving the efficiency of election processes, which has brought about a wide range of proposals for electronic voting. Electronic voting protocols are a reasonable alternative to conventional elections. Nevertheless, they keep evolving to meet their requirements, especially those needed to provide the full security expected of a democratic electronic vote. In the literature, different protocols based on public key schemes have been proposed to meet such security requirements. In this paper, we propose the use of bilinear pairings to provide the security requirements that an electronic voting protocol must meet, without requiring the entire infrastructure of a public key scheme. The proposed protocol uses two cryptographic primitives as its main building blocks: threshold and blind signature schemes. It is divided into four main stages: set-up, authentication, voting and counting. It achieves privacy, accuracy and robustness through bilinear pairings. We present a comparative analysis based on performance and on the key pairs, Trust Authorities and Certification Authorities the protocol requires.

Keywords: Bilinear pairings, Blind signatures, Electronic voting protocols, Identity based cryptography, Public key cryptography, Security requirements, Threshold cryptography

Title of the Paper: A New Immune Algorithm and its Application


Authors: Guo Meng, Chen Qiuhong

Abstract: The traditional single clonal selection algorithm has several disadvantages: it is easily trapped in local optima, it wastes many iterations on redundant computation in its later stages, and its global search ability is inferior. In this paper, a new artificial immune algorithm (IAAI) is proposed based on clonal selection theory and the anti-idiotype structure. It proceeds in four key steps: first, a dynamic clonal expansion; second, clonal mutation; third, dislocated-line clonal recombination; and last, clonal selection. Through these steps, IAAI evolves the whole antibody population, giving the algorithm strong search capabilities: it achieves better performance by performing global and local search in many directions of the solution space. The global convergence of the new algorithm is then analyzed on several typical complex test functions. The results show that the algorithm effectively overcomes premature convergence and improves both the global optimization ability and the speed of convergence.

Keywords: Immune algorithm, clonal selection, dynamic clonal expansion, clonal mutation, dynamic adjustment, anti-idiotype

Title of the Paper: Comparison of Different Topologies for Island-Based Multi-Colony Ant Algorithms for the Minimum Weight Vertex Cover Problem


Authors: Raka Jovanovic, Milan Tuba, Dana Simian

Abstract: The aim of this paper is to compare the effect of different topologies, or connection schemes between separate colonies, in island-based parallel implementations of Ant Colony Optimization applied to the Minimum Weight Vertex Cover Problem. We have previously investigated sequential Ant Colony Optimization algorithms for this problem. Parallelization of population-based algorithms using the island model is of great importance because it often yields a super-linear increase in performance. We observe the behavior of parallel algorithms corresponding to several topologies and communication rules: fully connected, replace-worst, ring, and independent parallel runs. We also propose a variation of the ring-topology algorithm that maintains the diversity of the search while still moving toward areas with better solutions; it gives slightly better results even on a single processor with threads.

Keywords: Ant colony optimization, Minimum weight vertex cover problem, Parallel computing, Combinatorial optimization, Evolutionary computing
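A ring topology's communication rule can be sketched as a migration step in which each island forwards its best solution to its successor, displacing the successor's worst. This is only an illustration of the topology; the solution representation (a list of costs) and the replacement rule are simplifying assumptions, not the authors' full ACO update.

```python
def ring_exchange(colonies):
    """One migration step over a ring of island colonies.

    colonies: list of lists of solution costs (lower is better).
    The best cost of colony i replaces the worst cost of colony
    (i + 1) mod n, if it actually improves on it.
    """
    bests = [min(c) for c in colonies]  # snapshot before any replacement
    n = len(colonies)
    for i, best in enumerate(bests):
        neighbour = colonies[(i + 1) % n]
        worst_idx = max(range(len(neighbour)), key=neighbour.__getitem__)
        if best < neighbour[worst_idx]:
            neighbour[worst_idx] = best
    return colonies

islands = ring_exchange([[5, 9], [7, 8], [3, 10]])
```

Because each island only talks to one neighbour, good solutions diffuse gradually around the ring, which preserves diversity longer than a fully connected exchange.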

Title of the Paper: Research on the Evaluation Methods of Bid of Construction Project Based on Improved BP Neural Network


Authors: Jie Yang, Hongjian Qu, Liang Zhou

Abstract: Bidding and tendering is the core stage of a construction project; however, many evaluation factors can only be handled by qualitative analysis, which makes subjectivity hard to avoid. In this paper, a new evaluation index system based on quantitative analysis is first proposed in order to avoid errors induced by subjectivity. Second, an improved BP neural network is proposed as the evaluation method. The results indicate that, with this new evaluation index system and the improved BP neural network, the evaluation process is simple and practical, and its accuracy satisfies actual requirements.

Keywords: Decision processes, construction project, BP neural network, Bid, tender, evaluation index system

Issue 2, Volume 9, February 2010

Title of the Paper: Study on the Dual Tender Offer Information Leakage: Based on Residual Error Ratio Model


Authors: Zhang Yi, Qu Hongjian, Qi Yuan, Xu Qifan

Abstract: Information leakage, insider trading and stock price manipulation of listed corporations have been highlighted in academic research, and most studies conclude that the problems are serious. New academic significance therefore lies in finding more effective means to address them. Since the "dual tender offer" is a characteristic feature of the Chinese capital market in its transition to market-oriented mergers and acquisitions, the study of "dual tender offer" information leakage is very important. Departing from the traditional perspective of abnormal stock price fluctuation, this paper adopts a new theoretical approach, the "residual error ratio model", to test stock price performance before the first announcement of "dual tender offer" information. The results indicate that the method is convenient and practical, and can also make up for the limitations of the "temporary suspension system" in detecting abnormal fluctuations of share prices.

Keywords: Information Leakage, Stock Price Abnormal Fluctuation, Residual Error Ratio Model, Dual Tender Offer

Title of the Paper: Data Mining Based on Rough Sets in Risk Decision-making: Foundation and Application


Authors: Li Wanqing, Ma Lihua, Wei Dong

Abstract: To address the problem of distinguishing redundant information in risk decision-making, this paper studies data mining algorithms based on Rough Sets. Risk decision-making is an important aspect of management practice, and in the risk decision process of a project it is necessary to discover valuable knowledge in order to make the right decision. The paper introduces Rough Sets as a data mining method for this field and studies the algorithmic process of Rough Set based data mining. According to Rough Set theory, first, the factor set is established, comprising condition attributes and a decision attribute. Second, experts qualitatively describe the risk factors and establish a decision database, called the decision table. Third, the attribute reduction algorithm based on Rough Sets is used to eliminate redundant risk factors and their values from the decision table. Fourth, the minimal decision rules are extracted using data mining technology. Finally, the process of risk decision-making based on Rough Set data mining is analyzed in a case study.

Keywords: Data mining, Rough Sets, minimum decision rule, attribute reduction, risk decision, project decision-making
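The core of Rough Set attribute reduction is the indiscernibility test: a condition attribute is dispensable when dropping it leaves every equivalence class of the remaining attributes mapped to a single decision. A minimal sketch of that test (the table, attribute names, and decision labels are invented for illustration):

```python
from collections import defaultdict

def is_dispensable(table, decision, attrs, candidate):
    """True if dropping `candidate` keeps the decision table consistent.

    table: list of dicts (condition attribute -> value), one per object.
    decision: list of decision values, aligned with `table`.
    An attribute is dispensable when every indiscernibility class of
    the remaining attributes still maps to exactly one decision value.
    """
    remaining = [a for a in attrs if a != candidate]
    classes = defaultdict(set)
    for row, d in zip(table, decision):
        key = tuple(row[a] for a in remaining)
        classes[key].add(d)
    return all(len(ds) == 1 for ds in classes.values())

# Toy decision table: attribute "b" turns out to be redundant
table = [{"a": 1, "b": 0}, {"a": 1, "b": 1}, {"a": 0, "b": 0}]
decision = ["high", "high", "low"]
```

A reduct is then any minimal subset of attributes from which no further attribute is dispensable; iterating this test greedily is one simple (though not always minimal) way to compute one.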

Title of the Paper: A New Model for Solving Portfolio Selections Based on Fuzzy Goals of Investors


Authors: Jie Yang, Hongjian Qu, Hejing Ge, Bai Xiaojuan

Abstract: Since the 1960s, many scholars have researched portfolio selection based on Markowitz's mean-variance theory and related methods. All of these studies rest on assumptions of certainty, under which researchers can derive the efficient set of portfolio selections. However, as the financial environment grows more complex, building a portfolio selection model becomes more difficult and requires further inquiry grounded in practice. In this paper, a new portfolio selection model is proposed, based on fuzzy goals for risk and return composed of subjective measure factors. These factors represent the investors' subjective thinking and are specified before the portfolio selection model is built; consequently, the goal does not change when the portfolio strategy changes. Theory and empirical studies show that the new model is close to the practical financial environment, simplifies the solution process, and is valuable.

Keywords: Markowitz Portfolio Selection Model, fuzzy goals, subjective measure factors, subjective risk, subjective return

Title of the Paper: Social Network Sites and Protection of Children: Regulatory Framework in Malaysia, Spain and Australia


Authors: Jawahitha Sarabdeen, Maria De-Miguel-Molina

Abstract: The open nature of social network sites offers many opportunities for children but also makes them vulnerable to abuse by various parties. Obscenity, hate speech, and indecent content unsuitable for children are very common on social network sites. The Malaysian, Spanish and Australian governments regulate this content just as they regulate content in traditional mass media. For regulatory compliance, most social networks do not allow children under 13-14 to access their services. However, the technology that enforces this restriction can easily be evaded, and service providers are still uncertain how to label content appropriate for child access. Both governments and corporations agree that this control is insufficient, so companies embark on self-regulation through Codes of Conduct. The objective of this paper is to compare how far regulation and self-regulation protect children on social network sites and what needs to be done to improve the effectiveness of regulation. The paper compares social networks in Malaysia, Spain and Australia to find strengths and opportunities that could enrich the regulation of social networks in those countries.

Keywords: Online, protection, children, social network, regulation, data privacy

Title of the Paper: A Novel Meta Predictor Design for Hybrid Branch Prediction


Authors: Young Jung Ahn, Dae Yon Hwang, Yong Suk Lee, Jin-Young Choi, Gyungho Lee

Abstract: Super-pipelining, dynamic scheduling and superscalar processor technologies have paved the way for today's high-performance systems. System performance is greatly affected by branch prediction accuracy, because the overhead of each misprediction has grown with the larger number of instructions per cycle and the deepened pipeline. Hybrid branch prediction is usually used to increase prediction accuracy on such high-performance systems. A hybrid branch predictor normally combines several branch predictors, and a meta-predictor selects which of them should be used for each branch instruction instance, according to the program context. In this paper, we discuss the saturating counter within the meta-predictor. A saturating counter designed to select the predictor with the higher prediction ratio yields high prediction accuracy for the hybrid branch predictor.

Keywords: Branch Prediction, Saturating Counter, Prediction Accuracy, Hybrid Branch Predictor, Meta Predictor
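The classic form of such a selector is a 2-bit saturating counter per branch context: states 0-1 favour one component predictor, states 2-3 the other, and the counter drifts toward whichever predictor was recently correct. The sketch below models that behaviour in software; the state encoding and update rule are the textbook scheme, not necessarily the authors' exact design.

```python
class SaturatingCounter:
    """2-bit saturating counter used as a meta-predictor selector.

    States 0-1 select predictor "A", states 2-3 select predictor "B".
    The counter moves one step toward the predictor that was correct,
    saturating at 0 and 3, so a single misprediction cannot flip a
    strongly biased choice (hysteresis).
    """
    def __init__(self, state=1):
        self.state = state  # 0..3

    def select(self):
        return "A" if self.state < 2 else "B"

    def update(self, a_correct, b_correct):
        if a_correct and not b_correct:
            self.state = max(0, self.state - 1)
        elif b_correct and not a_correct:
            self.state = min(3, self.state + 1)
        # if both or neither were correct, there is no reason to move
```

In hardware this is just two bits of state per entry in the meta-predictor table, which is why saturating counters remain the default choice for selectors.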

Title of the Paper: Adding Semantics to Software-as-a-Service and Cloud Computing


Authors: Francisco Garcia-Sanchez, Eneko Fernandez-Breis, Rafael Valencia-Garcia, Enrique Jimenez, Juan M. Gomez, Javier Torres-Nino, Daniel Martinez-Maqueda

Abstract: The Web is evolving from a mere repository of information to a new platform for business transactions and information interchange. Large organizations are increasingly exposing their business processes through Web Services technology for the large-scale development of software, as well as for sharing their services within and outside the organization. New paradigms for software and services engineering, such as Software-as-a-Service (SaaS) and the cloud computing model, promise to create new levels of efficiency through large-scale sharing of functionality and computing resources. However, there are few academic works in this area and many gaps remain open. In this paper, we present a semantics-based Business Process platform where services are executed from a SaaS perspective. The SITIO platform allows external developers to create add-on applications that integrate into the main SITIO application and are hosted on cloud computing infrastructure.

Keywords: Cloud Computing, Software-as-a-Service, Business Process Management, Semantic Technology

Title of the Paper: Prostate Cancer Prognosis Evaluation Assisted by Neural Networks


Authors: Corina Botoca, Razvan Bardan, Mircea Botoca, Florin Alexa

Abstract: Neural networks (NN) are promising new tools that can assist clinicians in the diagnosis process and in therapy decision-making, because they can deal with a great number of parameters, learn from examples and capture nonlinear relationships between inputs and outputs. In this paper, the problem of predicting prostate cancer evolution is approached using NN. The original database contained 650 records of patients who underwent radical prostatectomy for prostate cancer. The NN variables were the parameters with the highest prognostic value, selected and pre-processed from the original database. Different NN architectures and parameters were tested in order to obtain the best complexity/accuracy ratio. The input data were structured according to the latest statistical and representation concepts used in current medical practice, aiming to improve global performance. Experiments were run on both the raw database and the structured database. The NN performance was compared with the most widely used statistical prediction method, logistic regression. All NN models performed better than logistic regression. The best global prediction obtained, 96.94% correct classification, is better than the results of similar experiments available in the literature. The NN prediction performance might be improved further; in our opinion, its limits are set by the relatively small number of cases and the methods of collecting data.

Keywords: Neural networks, prostate cancer, prediction, capsule penetration

Title of the Paper: Using Parallel Signal Processing in Real-Time Audio Matrix Systems


Authors: Jiri Schimmel

Abstract: The paper deals with the design and performance analysis of algorithms that use parallel signal-processing methods and SIMD technology for the multiply-and-add algorithm in digital audio signal processing. This algorithm sums the gain-scaled input signals onto output buses in applications for distributing, mixing, effect-processing, and switching multi-format digital audio signals in an audio signal network on desktop processor platforms. The paper also studies the subjective evaluation of the latency inherent in real-time digital audio processing. Results of an analysis of the speed-up and real-time performance of several summing algorithms are presented, as well as a subjective evaluation of the latency as a function of the audio buffer size.

Keywords: Parallel processing, Parallel algorithms, Audio systems, Optimization methods, SIMD, Digital audio processing

Title of the Paper: Applications of Virtual Reality for Visually Impaired People


Authors: Torres-Gil, M. A., Casanova-Gonzalez, O., Gonzalez-Mora, J. L.

Abstract: This paper describes in detail the development and applications of a virtual reality simulator for visually impaired people. It builds an auditory representation of the virtual environment, rendering the virtual world entirely through hearing. The simulator has these main purposes: validation of auditory representation techniques; 3D sensor emulation for environment recognition and hardware integration; training of visually impaired users with these new auditory representations; and acoustic perception experiments aimed at improving the auditory rendering. Interaction with the simulator is performed through a 3D tracking system that locates the orientation and position of the user's head. This makes the interaction as natural as possible: the user simply "walks through" the environment while perceiving it through acoustic information.

Keywords: Virtual reality, visually impaired, spatial sound, electromagnetic position tracker

Title of the Paper: Automated Quantitative Assessment of Perifollicular Vascularization Using Power Doppler Ultrasound Images


Authors: Boris Cigale, Smiljan Sinjur, Damjan Zazula

Abstract: In this paper a prototype for the automated quantitative assessment of perifollicular vascularisation is described. This assessment is important for research, performed by the medical team at the Teaching Hospital of Maribor, into whether the application of hormonal therapy after follicle puncture in natural cycles is really always needed. The proposed algorithm works with 3D power Doppler ultrasound images and consists of several steps. In the first step, the position and shape of the dominant follicle are determined by a procedure based on the continuous wavelet transform. Then the vessels contained in a 5 mm thick layer around the follicle are categorized according to their diameter. The vessel thickness at a certain point is defined as the diameter of the largest sphere that includes the point and fits entirely inside the vessel. The results are statistically evaluated using histograms of vessel diameters. Finally, to improve the visual results, a vessel reconstruction based on minimal spanning trees is performed.

Keywords: 3D ultrasound image segmentation, vessel thickness assessment, vessel reconstruction

Issue 3, Volume 9, March 2010

Title of the Paper: Feature Selection of RAPD Haplotypes for Identifying Peach Palm (Bactris Gasipaes) Landraces using SVM


Authors: Jose Luis Vasquez, Javier Vasquez, Juan Carlos Briceno, Elena Castillo, Carlos M. Travieso

Abstract: This work presents a robust system for feature reduction using Deoxyribonucleic Acid (DNA) primers. The system reaches up to 100% class identification based on Support Vector Machines (SVM). In particular, the biochemical parameterization comprises 89 Random Amplified Polymorphic DNA (RAPD) primers of peach palm (pejibaye) landraces, which has been reduced to 10 RAPD primers. This reduction provides economic and computational advantages: with fewer primers, the cost of the process drops to 11.24% of the initial one, and the supervised classification system becomes faster, making it practical as a method for certifying the denomination of origin of plants.

Keywords: Dimensionality Reduction, feature selection, DNA analysis, supervised classification, SVM, Artificial Neural Network, Cluster analysis

Title of the Paper: Traveling Wave Solutions for the Generalized Burgers Equation and the (2+1) Dimensional Dispersive Equation By (G'/G)-Expansion Method


Authors: Qinghua Feng, Bin Zheng

Abstract: In this paper, we test the validity and reliability of the (G'/G)-expansion method by applying it to obtain the exact traveling wave solutions of the generalized Burgers equation and the (2+1) dimensional dispersive equation. The traveling wave solutions are obtained in three forms. Being concise and less restrictive, the method can also be applied to many other nonlinear partial differential equations.

Keywords: (G'/G )-expansion method, Traveling wave solutions, generalized Burgers equation, (2+1) dimensional dispersive equation, exact solution, evolution equation, nonlinear equation
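For reference, the standard form of the (G'/G)-expansion ansatz is sketched below. This is the generic scheme of the method as commonly stated in the literature; the balancing order and coefficients for the specific equations are worked out in the paper itself.

```latex
% Seek traveling wave solutions u(x,t) = u(\xi), \xi = x - ct, as a
% finite power series in (G'/G):
u(\xi) = \sum_{i=0}^{m} a_i \left( \frac{G'(\xi)}{G(\xi)} \right)^{i},
\qquad a_m \neq 0,
% where G satisfies the linear second-order ODE
G''(\xi) + \lambda\, G'(\xi) + \mu\, G(\xi) = 0 .
% The degree m is fixed by balancing the highest-order derivative
% against the highest-order nonlinear term. The three solution
% families follow from the discriminant \lambda^2 - 4\mu:
% > 0 gives hyperbolic, < 0 trigonometric, = 0 rational solutions.
```

Substituting the ansatz into the reduced ODE and collecting powers of (G'/G) yields an algebraic system for the a_i, c, λ and μ.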

Title of the Paper: Traveling Wave Solutions for Three Non-linear Equations By (G'/G)-Expansion Method


Authors: Qinghua Feng, Bin Zheng

Abstract: In this paper, we obtain new exact solutions of the DSSH equation, the KP-BBM equation and the (3+1) dimensional potential-YTSF equation. The three nonlinear equations are reduced to nonlinear ordinary differential equations (ODEs) by simple transformations. We then construct the traveling wave solutions of the equations in terms of hyperbolic functions, trigonometric functions and rational functions using the (G'/G)-expansion method.

Keywords: (G'/G )-expansion method, Traveling wave solutions, DSSH equation, KP-BBM equation, (3+1) dimensional potential-YTSF equation, exact solution, evolution equation, nonlinear equation

Title of the Paper: Integrating Weighted LCS and SVM for 3D Handwriting Recognition on Handheld Devices using Accelerometers


Authors: Wang-Hsin Hsu, Yi-Yuan Chiang, Jung-Shyr Wu

Abstract: We propose an accelerometer-based 3D handwriting recognition system in this paper. The system consists of four main parts: (1) data collection: a single tri-axis accelerometer mounted on a handheld device collects the handwriting data; a set of key patterns is written with the handheld device several times for subsequent processing and training; (2) data preprocessing: the time series are mapped into the eight octants of the three-dimensional Euclidean coordinate system; (3) training: weighted LCS and SVM are combined to perform the classification task; (4) recognition: the trained SVM model carries out the prediction task. To evaluate the performance of our handwriting recognition model, we ran an experiment recognizing a set of English words. A classification accuracy of about 96.85% was achieved.

Keywords: Accelerometer, gesture recognition, handwriting recognition, LCS, SVM
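The LCS component can be sketched with the classic dynamic-programming recurrence. The paper's weighted variant scores matches non-uniformly; this sketch uses unit weights, and the octant-coded strings below are invented placeholders for acceleration sequences.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of sequences a and b.

    dp[i][j] holds the LCS length of a[:i] and b[:j]; a match extends
    the diagonal, otherwise we keep the better of dropping one symbol
    from either sequence.
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

# Two octant-coded gesture traces sharing the subsequence "1234"
similarity = lcs_length("12634", "12734")
```

Such LCS scores between a trace and each stored key pattern can then be fed to the SVM as similarity features, which matches the combination the abstract describes.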

Title of the Paper: The Use of Radon Transform in Handwritten Arabic (Indian) Numerals Recognition


Authors: Sabri A. Mahmoud, Marwan H. Abu-Amara

Abstract: This paper describes a technique for the recognition of off-line handwritten Arabic (Indian) numerals using the Radon and Fourier transforms. Radon-Fourier-based features are used to represent Arabic digits. A Nearest Mean Classifier (NMC), a K-Nearest Neighbor Classifier (K-NNC), and a Hidden Markov Model Classifier (HMMC) are used. An analysis using different numbers of projections, varying numbers of Radon-based features, and different numbers of training and testing samples is presented for the NMC and K-NNC. A database of 44 writers with 48 samples per digit each, totaling 21120 samples, is used for training and testing the technique. The training and testing of the HMMC differs from that of the NMC and K-NNC in its internal working and in the way data is presented to the classifier. Since the digits have equal probability, randomization of the digits is necessary in training the HMMC: 80% of the data was used for training and the remaining 20% for testing. Radon-based features are extracted from the Arabic numerals and used in training and testing the HMM. In this work we did not follow the general trend, in HMM classification, of generating features with sliding windows along the writing line; instead, we generated features based on the digit as a unit. Several experiments were conducted to estimate a suitable number of states for the HMM, and we experimented with different numbers of observations per digit. The Radon-Fourier-based features proved to be simple and effective. The classification errors were analyzed; the majority were due to the misclassification of digit 7 as 8 and vice versa. Hence, a second, structural classifier is used in a cascaded (second) stage after the NMC, K-NNC, and HMMC. This stage, which is based on the structural attributes of the digits, improved the average overall recognition rate by 3.1% to 4.05% (recognition rates of 98.66%, 98.33%, and 97.1% for the NMC, K-NNC, and HMMC, respectively).
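As a rough illustration of the Radon-based features described above, the following sketch computes line-sum projections of a tiny digit image at a few angles (a hypothetical toy implementation, not the authors' code; the centre convention and nearest-bin accumulation are assumptions):

```python
import math

def radon_projections(img, angles):
    """Discrete Radon transform sketch: line-sum projections of a small image.
    At 0 degrees mass is accumulated along columns, at 90 along rows."""
    h, w = len(img), len(img[0])
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    nbins = h + w
    out = []
    for a in angles:
        c, s = math.cos(math.radians(a)), math.sin(math.radians(a))
        proj = [0.0] * nbins
        for y in range(h):
            for x in range(w):
                # signed distance of pixel (x, y) from the line through the centre
                r = (x - cx) * c + (y - cy) * s
                proj[int(round(r)) + nbins // 2] += img[y][x]
        out.append(proj)
    return out

# A vertical stroke: at 0 degrees all mass falls into one bin, at 90 it spreads.
stroke = [[0, 1, 0],
          [0, 1, 0],
          [0, 1, 0]]
projs = radon_projections(stroke, [0, 90])
```

The directional sensitivity of the projections is what makes them usable as digit features.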

Keywords: Arabic numeral recognition, OCR, Hidden Markov Models, Handwritten Digit recognition, Nearest neighbor classifier

Title of the Paper: WxShapeFramework: An Easy Way for Diagrams Manipulation in C++ Applications


Authors: Michal Bliznak, Tomas Dulik, Vladimir Vasek

Abstract: wxShapeFramework is a new cross-platform software library, written in C++, suitable for creating applications that manipulate diagrams, images, and other graphical objects. Thanks to underlying technologies such as the wxWidgets toolkit and its XML-based persistent data container add-on, wxXmlSerializer, it is an ideal solution for rapid and easy cross-platform visualisation software development. The paper shows how wxSF allows the user to easily create applications able to interactively handle various scenes consisting of pre-defined or user-defined graphic objects (both vector- and bitmap-based) or GUI controls, store them in XML files, export them to bitmap images, print them, etc. Moreover, thanks to the applied software licence, the library can be used for both open-source and commercial projects on all main target platforms, including MS Windows, MacOS, and Linux.

Keywords: Diagram, vector, bitmap, GUI, wxWidgets, wxXmlSerializer, wxShapeFramework, wxSF, C++

Title of the Paper: Interactive Compression of Books


Authors: Bruno Carpentieri

Abstract: In this paper we study interactive data compression and present experimental results on the interactive compression of textual data (books and electronic newspapers) in Italian and English. The main intuition is that, when we have already compressed a large number of similar texts, we can use this previous knowledge of the emitting source to improve the compression of the current text, and we can design algorithms that compress and decompress efficiently given this knowledge. By doing so, in the fundamental source coding theorem we substitute entropy with conditional entropy, obtaining a new theoretical limit that allows for better compression. Moreover, if we assume the possibility of interaction between the compressor and the decompressor, then we can exploit the previous knowledge they have of the source. The price we pay is a small probability of communication errors.
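The substitution of entropy by conditional entropy can be illustrated with a toy calculation (a sketch only; the first-order character model is an illustrative assumption, not the paper's source model):

```python
import math
from collections import Counter

def entropy(text):
    """Zeroth-order entropy H(X) in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def conditional_entropy(text):
    """First-order conditional entropy H(X | previous symbol) in bits."""
    pairs = Counter(zip(text, text[1:]))   # joint counts of (prev, next)
    ctx = Counter(text[:-1])               # marginal counts of the context
    n = len(text) - 1
    return -sum(c / n * math.log2(c / ctx[a]) for (a, b), c in pairs.items())

sample = "the theme of the thesis is the theory of the thing"
h0 = entropy(sample)
h1 = conditional_entropy(sample)
# Conditioning on context lowers the coding limit: H(X | Y) <= H(X).
```

The gap between h0 and h1 is exactly the extra compression the previous knowledge buys in this toy model.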

Keywords: Data Compression, Interaction, Dictionary based compression, Fingerprinting

Title of the Paper: CROVALLEX Lexicon Improvements: Subcategorization and Semantic Constraints


Authors: Nives Mikelic Preradovic

Abstract: The paper describes the Croatian valence verb lexicon (CROVALLEX), which contains information on the syntactic subcategorization and semantic restrictions of the 1739 most frequent Croatian verbs. These 1739 verbs are associated with 5118 valence frames and enriched with 72 broad semantic classes with two further levels of subdivision (173 classes in total). The evaluation shows that syntacto-semantic verb classification helps capture the relation between the syntax and semantics of Croatian verbs and therefore reduces redundancy in the lexicon. Unfortunately, the classes in the current version of CROVALLEX do not provide a means for fully inferring a verb's semantics from its syntactic behavior. In the improved version we therefore plan to introduce more distinctive semantic roles, with semantic typing based on the EuroWordNet Top Ontology. We believe that with such improvements we can solve the problem of sense differentiability and obtain a finer-grained semantic classification of verbs in the Croatian language.

Keywords: Croatian verb valence lexicon, Valence frames, Syntacto-semantic classes, Verb synsets

Title of the Paper: Enhancing Enterprise Service Bus Capability for Load Balancing


Authors: Aimrudee Jongtaveesataporn, Shingo Takada

Abstract: The ESB is a core middleware technology that supports the integration of services according to the Service Oriented Architecture. A major responsibility of an ESB is to route messages to heterogeneous services. However, conventional ESBs support only static routing, i.e., the service to which a message is sent must be fixed a priori. Thus, even if many services can satisfy the same request, the request is always sent to the same service without considering the service status, e.g., its load, at that time. This situation may lead to low throughput on the service side and low satisfaction on the consumer side. This paper aims to enhance ESB capability by supporting load balancing. Our approach focuses on balancing among a group of different services with the same function. We introduce the concept of service type and show the results of an experiment.

Keywords: ESB, Middleware Message Balancing, Web Services, Load Balancing, SOA

Title of the Paper: Implementing Time Series Identification Methodology Using Wireless Sensor Networks


Authors: Daniel-Ioan Curiac, Ovidiu Banias, Constantin Volosencu

Abstract: Wireless sensor networks, as collections of numerous sensor nodes, each with sensing (temperature, humidity, sound level, light intensity, magnetism, etc.) and wireless communication capabilities, provide huge opportunities for monitoring and for mathematically modeling the time evolution of the physical quantities under investigation. Starting from the measurements collected by the sensor nodes inside a spatially distributed system under investigation, this paper offers an efficient methodology for identifying time series.
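The kind of time-series identification the methodology targets can be sketched under the simplifying assumption of an AR(2) model fitted by least squares (the model order, the noiseless synthetic readings, and the hand-solved normal equations are illustrative assumptions, not the paper's method):

```python
def fit_ar2(series):
    """Least-squares fit of x[t] = a1*x[t-1] + a2*x[t-2].
    Solves the 2x2 normal equations by hand (Cramer's rule)."""
    y  = series[2:]
    x1 = series[1:-1]
    x2 = series[:-2]
    s11 = sum(v * v for v in x1)
    s22 = sum(v * v for v in x2)
    s12 = sum(u * v for u, v in zip(x1, x2))
    b1  = sum(u * v for u, v in zip(x1, y))
    b2  = sum(u * v for u, v in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (s11 * b2 - s12 * b1) / det

# Synthetic "sensor" readings generated by a known AR(2) process (no noise),
# so the fit should recover the coefficients 0.6 and -0.2 almost exactly.
data = [1.0, 0.5]
for _ in range(50):
    data.append(0.6 * data[-1] - 0.2 * data[-2])
a1, a2 = fit_ar2(data)
```

Real sensor data would add noise and call for model-order selection, which this sketch omits.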

Keywords: Time series, system identification, sensor networks, interpolation

Issue 4, Volume 9, April 2010

Title of the Paper: Synchronized Alternating Turing Machines on Four-Dimensional Input Tapes


Authors: Makoto Sakamoto, Tomoya Matsukawa, Ryoju Katamune, Hiroshi Furutani, Michio Kono, Satoshi Ikeda, Takao Ito, Yasuo Uchida, Tsunehiro Yoshinaga

Abstract: A synchronized alternating machine is an alternating machine with a special subset of internal states called synchronizing states. This paper introduces a four-dimensional synchronized alternating Turing machine (4-SATM) and investigates fundamental properties of 4-SATM’s. The main topics of this paper are: (1) the relationship between the accepting powers of 4-SATM’s and four-dimensional alternating Turing machines with small space bounds, (2) the relationship between the accepting powers of seven-way and eight-way 4-SATM’s, and (3) the relationship between the accepting powers of 4-SATM’s and four-dimensional nondeterministic Turing machines. In this paper, we let the side-lengths of each input tape of these automata be equal in order to increase the theoretical interest.

Keywords: Alternation, computational complexity, configuration, four-dimensional Turing machine, synchronization

Title of the Paper: Hardware Hierarchies and Recognizabilities of Four-Dimensional Synchronized Alternating Turing Machines


Authors: Makoto Sakamoto, Ryoju Katamune, Tomoya Matsukawa, Hiroshi Furutani, Michio Kono, Satoshi Ikeda, Takao Ito, Yasuo Uchida, Tsunehiro Yoshinaga

Abstract: Recent advances in computer animation, motion image processing, robotics, and so on have prompted us to analyze the computational complexity of four-dimensional pattern processing, making research on four-dimensional automata as a computational model of such processing meaningful. From this viewpoint, we previously introduced a four-dimensional alternating Turing machine (4-ATM) operating in parallel. In this paper, we continue the investigation of 4-ATM’s, deal with a four-dimensional synchronized alternating Turing machine (4-SATM), and investigate some properties of 4-SATM’s whose input tapes have equal side-lengths. The main topics of this paper are: (1) hierarchies based on the number of processes of 4-SATM’s, and (2) the recognizability of connected pictures by 4-SATM’s.

Keywords: Alternation, four-dimensional Turing machine, hierarchy, recognizability, synchronization

Title of the Paper: Novel Models for Multi-Agent Negotiation based Semantic Web Service Composition


Authors: Sandeep Kumar, Nikos E. Mastorakis

Abstract: Multi-agent based semantic web service composition involves composing semantic web services while considering each agent's capability to serve a particular service request. This paper presents two variations of the semantic web service composition process, based on the timing of negotiation within the composition process: the negotiation between the service requester and the service providers is performed either before or after the final service provider has been selected. Further, based upon one of these models, a novel multi-agent based semantic web service composition approach is presented.

Keywords: Agent, composition, negotiation, semantic web

Title of the Paper: A Scheme for Salt and Pepper Noise Reduction and its Application for OCR Systems


Authors: Nucharee Premchaiswadi, Sukanya Yimgnagm, Wichian Premchaiswadi

Abstract: This paper presents an algorithm for salt-and-pepper noise reduction that can be applied to binary, gray-scale, and color documents. The scheme combines the characteristics of the Applied kFill algorithm and the median filter, using window sizes of 3x3 and 5x5 depending on the size of the salt-and-pepper noise. The goal of this technique is to increase the PSNR of images and to improve the quality of scanned documents for use with an optical character recognition (OCR) system. The experimental results show that the proposed scheme removes salt-and-pepper noise better than the Applied kFill algorithm and the median filter alone, and can significantly improve the recognition accuracy of an OCR system.
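The median-filter half of the scheme can be sketched as follows (a minimal pure-Python 3x3 version with a simple keep-borders policy; the Applied kFill part and the adaptive window-size choice are not reproduced):

```python
def median_filter(img, k=3):
    """Apply a k x k median filter to a 2-D grey image (list of lists).
    Border pixels are left unchanged (simplest border policy)."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [row[:] for row in img]
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = sorted(img[y + dy][x + dx]
                            for dy in range(-r, r + 1)
                            for dx in range(-r, r + 1))
            out[y][x] = window[len(window) // 2]
    return out

# A flat grey image corrupted with one "salt" and one "pepper" pixel:
noisy = [[128] * 5 for _ in range(5)]
noisy[2][2] = 255   # salt
noisy[1][3] = 0     # pepper
clean = median_filter(noisy)
```

Isolated impulses are replaced by the local median (128 here), which is exactly why the median filter suits salt-and-pepper noise better than averaging.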

Keywords: Applied kFill, kFill Algorithm, Image Processing, Median Filter, Noise Reduction, OCR systems, Salt and Pepper Noise

Title of the Paper: Improving Performance in Adaptive Fault Tolerance Structure with Investigating the Effect of the Number of Replication


Authors: Negar Mosharraf, Mohammad Reza Khayyambashi

Abstract: Given the wide use of distributed systems in various areas, fault tolerance is an important system characteristic, and in the design of real-time distributed systems it is even more so. Although middleware such as CORBA is used in designing such systems to increase their compatibility, speed, and performance and to simplify network programming, no supporting infrastructure provides a distributed real-time system and fault tolerance at the same time. An adaptive structure takes into account the properties of both, so that the requirements of the two are met at run time; this is usually achieved by a trade-off between the specifications of real-time and fault-tolerant systems. In this study, the FT-CORBA structure, used for supporting fault-tolerant programs, is reviewed together with the relevant parameters, including the replication style and the number of replicas, which play a major role in improving performance and in making it adaptive to real-time distributed systems. Based on these specifications, a structure adapted to real-time systems with higher performance than FT-CORBA has been built; finally, the implementation of this structure, the determination of the number of replicas and the object replication style, and the significance of the related parameters have been investigated.

Keywords: Middleware, Fault Tolerant, Trade-offs, Replication, Real time

Title of the Paper: Use of GA based Approach for Engineering Design through WWW


Authors: Shu-Tan Hsieh

Abstract: Today’s industrial environment requires engineering design to be carried out by geographically distributed engineering teams who may work on different computer platforms, so the problem can be cast as a distributed constraint optimization problem. This paper presents an agile approach that carries out concurrent optimization of a product design and the satisfaction of its associated manufacturing constraints. The approach has been implemented through the World Wide Web (WWW), regardless of geographical constraints and the different platforms used. Hybrid evolutionary computation (EC) approaches combining a genetic algorithm with stochastic annealing are applied to find optimal or near-optimal solutions for two engineering design cases. The main contribution of this paper is an agile, WWW-based approach for solving engineering design problems modeled as nonlinear programs. Experimental results are presented to exhibit the superior performance of the proposed methodology.
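A hybrid of a genetic algorithm with an annealing-style acceptance rule, in the spirit described above, might look like this (a toy one-dimensional sketch; the operators, cooling schedule, and test function are assumptions, not the paper's formulation):

```python
import math
import random

def hybrid_ga(fitness, lo, hi, pop_size=20, gens=80, temp=1.0, cool=0.9, seed=1):
    """GA whose mutations are accepted by a simulated-annealing rule (minimisation)."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]                 # truncation selection
        nxt = parents[:]                              # elitism keeps the best half
        while len(nxt) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2.0                     # arithmetic crossover
            mutant = child + rng.gauss(0.0, temp)     # mutation scale follows temperature
            delta = fitness(mutant) - fitness(child)
            # Metropolis acceptance: keep worse mutants with decreasing probability.
            if delta < 0 or rng.random() < math.exp(-delta / max(temp, 1e-12)):
                child = mutant
            nxt.append(min(hi, max(lo, child)))       # clamp to the feasible interval
        pop = nxt
        temp *= cool                                  # cooling schedule
    return min(pop, key=fitness)

# Minimise a simple quadratic with optimum at x = 3:
best = hybrid_ga(lambda x: (x - 3.0) ** 2, -10.0, 10.0)
```

Early on, the annealing rule lets the GA accept worse mutants to escape local optima; as the temperature cools, it degenerates into pure hill-climbing.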

Keywords: Evolutionary computation, genetic algorithm, stochastic annealing, nonlinear programming, world wide web

Title of the Paper: A Study of Computer and Information Course Curricula for the General Education in Taiwan University


Authors: Lung-Hsing Kuo

Abstract: Contemporary university education should not only help students develop professional knowledge and skills, but also consider how it can truly serve the country by developing rational thinking, humanistic literacy, and socially engaged professionals. This study focused on counts and social network analysis of "Computer and Information Course" offerings in general education curricula, in order to understand current Taiwanese higher education institutions. In this study we used the university curriculum resource network to obtain sample resources, and the software UCINET 6.198 to analyze the social network variables and determine network positions.

Keywords: Social network, Computer and Information Course, General Education, UCINET

Title of the Paper: Rethinking Database Updates using a Multiple Assignment-based Approach


Authors: Elizabeth Hudnott, Jane Sinclair, Hugh Darwen

Abstract: We investigate the problems involved in efficiently implementing multiple assignment to database tables, as suggested by Date and Darwen in their Third Manifesto proposal for future database systems [10]. We explain the connection between assignment and the insert, delete, and update operations, and why multiple assignments executed simultaneously are preferable to deferred constraint checking. Our contributions are twofold. Firstly, we enable the user to specify updates either in terms of the changes needed to the existing state or as the final table contents directly, with no degradation in performance. Secondly, when multiple tables are updated, SQL places the responsibility on the user to order the update statements correctly: integrity constraints must either be preserved in unnecessary intermediate states or else deferred. Multiple assignment accepts updates across the entire database simultaneously and makes the system responsible for scheduling them correctly. We present methods for a proposed implementation that can potentially exceed the performance of SQL DBMSs by employing parallelism and multi-query optimization.
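The semantics of multiple assignment, where every right-hand side is evaluated against the old state and integrity constraints are checked only on the final state, can be sketched like this (a toy in-memory model, not a DBMS implementation; the names and helpers are hypothetical):

```python
def multiple_assign(db, updates, constraints):
    """Evaluate every right-hand side against the OLD state, commit atomically,
    and check integrity constraints only on the final state."""
    new_vals = {name: rhs(db) for name, rhs in updates.items()}  # all read old state
    result = {**db, **new_vals}
    for check in constraints:
        if not check(result):
            raise ValueError("integrity constraint violated; assignment rejected")
    return result

# Toy transfer example: move 50 between accounts in one multiple assignment.
# No intermediate state ever exists in which the total is not conserved.
db = {"a": 100, "b": 200}
db = multiple_assign(
    db,
    {"a": lambda s: s["a"] - 50, "b": lambda s: s["b"] + 50},
    [lambda s: s["a"] + s["b"] == 300],
)
```

In SQL the two updates would run in sequence, so the conservation constraint would have to be deferred or would fail between the statements; here it holds at the only observable state.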

Keywords: Constraints, Multiple assignment, Multi-query optimization, Parallel updates, Query independence, Simultaneous assignment

Title of the Paper: Integrating Machine Learning in Intelligent Bioinformatics


Authors: Aboubekeur Hamdi-Cherif

Abstract: Machine learning is the adaptive process that makes computers improve from experience, by example, and by analogy. Learning capabilities are essential for automatically enhancing the performance of a computational system over time on the basis of its previous history. Bioinformatics is the interdisciplinary science of interpreting biological data using information technology and computer science; its main objective is to develop relevant computational systems for biological purposes. In this paper, we study how machine learning can help in developing better bioinformatics methods and tools in a coherent manner. We attempt to integrate the multitude of existing methods and tools in a unifying framework, as a prelude to showing how machine learning can uncover even more useful structures hidden in biological sequences.

Keywords: Intelligent Bioinformatics, Machine learning, Soft computing, Data mining, Grammatical inference

Issue 5, Volume 9, May 2010

Title of the Paper: TAR Based Shape Features in Unconstrained Handwritten Digit Recognition


Authors: P. Ahamed, Yousef Al-Ohali

Abstract: In this research, the recognition accuracy of triangle-area representation (TAR) based shape features is measured on totally unconstrained handwritten digits. TAR features for triangles of variable side lengths, formed by taking combinations of different contour points, were computed, and the set of contour points that yielded the best features was discovered experimentally. For classification, a curve matching technique is used. Several experiments were conducted on real-life sample data collected from postal zip codes written by mail writers. The highest recognition result of 98.5% was achieved on the training data set and 98.3% on the test data set.
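The triangle-area representation itself is simple to sketch: for each contour point, take the signed area of the triangle it forms with its neighbours at separation t (a hypothetical minimal version; the authors' choice of contour points and side lengths is not reproduced):

```python
def tar(contour, t):
    """Signed triangle areas for each contour point p[i] with p[i-t] and p[i+t].
    The sign distinguishes convex from concave points; 0 means collinear."""
    n = len(contour)
    feats = []
    for i in range(n):
        x1, y1 = contour[(i - t) % n]
        x2, y2 = contour[i]
        x3, y3 = contour[(i + t) % n]
        # Half the cross product of the two edge vectors.
        feats.append(0.5 * ((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)))
    return feats

# A square contour: corners give non-zero area, edge midpoints are collinear.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
areas = tar(square, 1)
```

Varying t yields the multi-scale feature set the abstract refers to; this sketch shows a single scale.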

Keywords: Triangle area representation, Shape descriptors, Digit recognition, Contour points, and zip codes

Title of the Paper: A Dynamic Dataflow Architecture using Partial Reconfigurable Hardware as an Option for Multiple Cores


Authors: Jorge Luiz E. Silva, Joelmir Jose Lopes

Abstract: Moore's Law was one of the motivations for duplicating cores, and at least until today this remains the solution for the safe power consumption and operation of systems using millions of transistors. In software, parallelism will be a trend over the coming years. One of the challenges is to create tools that let programmers using a High Level Language (HLL) produce hardware directly. Such tools should make the most of programmers' experience and of the flexibility of FPGAs (Field Programmable Gate Arrays). The central element of the existing tools that directly convert HLL into hardware is the dependence graph; a dynamic dataflow architecture, on the other hand, has implicit parallelism. ChipCflow is a tool that converts C directly into hardware, using partial reconfiguration of FPGAs and based on a dynamic dataflow architecture. In this paper, the relation between traditional and contemporary dataflow architectures, as well as the main characteristics of the ChipCflow project, are presented.

Keywords: Dataflow Architecture; Reconfigurable Hardware; Tagged-token; Run-time Reconfiguration; Protocol for dataflow

Title of the Paper: Fault-Tolerant Meshes and Tori Embedded in a Faulty Supercube


Authors: Jen-Chih Lin, Shih-Jung Wu, Huan-Chao Keh, Lu Wang

Abstract: Hypercubes, meshes, and tori are well-known interconnection networks for parallel computing. The Supercube network is a generalization of the hypercube; its main advantage is that it has the same connectivity and diameter as the hypercube without the constraint that the number of nodes be a power of 2. This paper proposes novel algorithms for embedding fault-tolerant meshes and tori in supercubes with node failures. The main results obtained are: (1) a replacing sequence of a supercube includes approximately (2⌈log N⌉+1) nodes, so that O() faults can be tolerated; (2) this implies optimal simulation of meshes and tori in a faulty supercube, balancing the processor and communication link loads. According to these results, parallel and distributed algorithms developed for these structures can easily be ported to supercubes, so these reconfiguration methods enable extremely high-speed parallel computation.

Keywords: Fault-tolerant, mesh, tori, graph embedding, supercube

Title of the Paper: A Novel Object Detection Approach Based on the Boundary Shape Information from High Resolution Satellite Imagery


Authors: Xiaoshu Si, Xuemin Hu, Hong Zheng

Abstract: This paper presents a novel approach to detecting special objects in high resolution satellite imagery. In this approach, bilateral filtering is first used to denoise the image, and a new morphological approach combining gray-scale and binary morphological processing is proposed for ROI extraction and feature enhancement. A detection operator based on boundary shape information (BSI) is developed to detect the enhanced objects. Experiments on images from Google Earth are discussed in the paper; the results show that the proposed approach is effective and feasible. Compared with other object detection approaches for high resolution satellite imagery, such as PCA or MSNN, the proposed approach has better detection performance.

Keywords: BSI, Detection template, Object detection, Vehicle, Aircraft, High resolution satellite imagery

Title of the Paper: On-line Content-Based Image Retrieval System using Joint Querying and Relevance Feedback Scheme


Authors: Wichian Premchaiswadi, Anucha Tungkatsathan

Abstract: In a high-level semantic retrieval process, we utilize a search engine to retrieve a large number of images for a given text-based query. In a low-level image retrieval process, the system provides a similar-image search function with which the user updates the input query for image similarity characterization. This paper presents an on-line content-based image retrieval system using a joint querying and relevance feedback scheme based on both high-level and low-level features. We also introduce a fast and efficient color feature extraction method, namely auto color correlogram and correlation (ACCC), based on the color correlogram (CC) and autocorrelogram (AC) algorithms, for extracting and indexing low-level image features. To incorporate an image analysis algorithm into text-based image search engines without degrading their response time, a multi-threaded processing framework is proposed. Experimental evaluations based on the coverage ratio measure show that our scheme significantly improves the retrieval performance of existing image search engines.
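The autocorrelogram (AC) building block can be sketched as the per-colour probability that a pixel at distance d has the same colour (a toy version using only four offsets per pixel; the full ACCC combination is not reproduced):

```python
def autocorrelogram(img, d):
    """For each colour c, estimate P(neighbour at distance d also has colour c).
    Toy sketch: only the four axis-aligned offsets at distance d are sampled."""
    h, w = len(img), len(img[0])
    same, total = {}, {}
    for y in range(h):
        for x in range(w):
            c = img[y][x]
            for dy, dx in ((0, d), (0, -d), (d, 0), (-d, 0)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    total[c] = total.get(c, 0) + 1
                    same[c] = same.get(c, 0) + (img[ny][nx] == c)
    return {c: same[c] / total[c] for c in total}

# Two flat colour halves: a colour-0 pixel usually sees colour 0 at distance 1,
# except along the boundary column, so the probability is high but below 1.
img = [[0, 0, 1, 1] for _ in range(4)]
ac = autocorrelogram(img, 1)
```

Unlike a plain histogram, this captures spatial coherence: scrambling the pixels would leave the histogram unchanged but lower these probabilities.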

Keywords: Joint querying, image retrieval, ACCC, on-line CBIR, CBIR, relevance feedback

Title of the Paper: Investigating Better Multi-layer Perceptrons for the Task of Classification


Authors: Hyontai Sug

Abstract: Deciding proper sample sizes for multi-layer perceptrons tends to be arbitrary, so, depending on the sample data set, the performance of trained multi-layer perceptrons tends to fluctuate. As the sample size grows, the prediction accuracy of multi-layer perceptrons slowly improves, with some fluctuation. To exploit this property, this paper suggests a progressive and repeated sampling technique for building better multi-layer perceptrons that copes with the fluctuation of prediction accuracy depending on the samples as well as on their size. Experiments with six data sets from the UCI machine learning repository showed very good results.

Keywords: Multi-layer perceptrons, neural networks, data mining, classification

Title of the Paper: A Robust Watermarking Technique for Copyright Protection Using Discrete Wavelet Transform


Authors: Wen-Tzeng Huang, Sun-Yen Tan, Yuan-Jen Chang, Chin-Hsing Chen

Abstract: In the digital world, media content can easily be altered, duplicated, and spread, violating the copyright of the media. Attention has therefore turned to protecting the intellectual property (IP) rights of digital media, for which digital watermarking is a simple and effective approach. In this study, a robust, blindly extractable watermarking method for static images is proposed. It utilizes the discrete wavelet transform and applies three coding methods according to the different characteristics of the band coefficients: lattice coding based on communication principles, modification of insignificant coefficients based on the just-noticeable distortion of the human visual model, and quantization index modulation based on singular value decomposition. Together, these methods embed a watermark while maintaining image fidelity. Our experimental results indicate that the proposed approach is highly robust against frequency-based and time-domain geometric attacks. Additionally, since our approach produces a blind watermark, neither the original image nor any of its related information is needed for extraction, making it a very convenient and practical watermarking technique.
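The discrete wavelet transform underlying the method can be illustrated with a one-level 2-D Haar decomposition (a sketch of the transform only; the three band-specific coding methods of the paper are not reproduced):

```python
def haar2d(img):
    """One-level 2-D Haar DWT: returns the four quarter-size sub-bands
    (low-low, low-high, high-low, high-high) of an even-sized grey image."""
    def step(rows):
        # pairwise averages (low-pass) followed by pairwise differences (high-pass)
        return [[(r[2 * i] + r[2 * i + 1]) / 2 for i in range(len(r) // 2)] +
                [(r[2 * i] - r[2 * i + 1]) / 2 for i in range(len(r) // 2)]
                for r in rows]
    t = step(img)                                                    # transform rows
    t = [list(c) for c in zip(*step([list(c) for c in zip(*t)]))]    # then columns
    h, w = len(img) // 2, len(img[0]) // 2
    ll = [row[:w] for row in t[:h]]
    hl = [row[w:] for row in t[:h]]
    lh = [row[:w] for row in t[h:]]
    hh = [row[w:] for row in t[h:]]
    return ll, lh, hl, hh

# A flat image concentrates all its energy in the LL band; detail bands are zero.
flat = [[8.0] * 4 for _ in range(4)]
ll, lh, hl, hh = haar2d(flat)
```

Watermark bits are typically embedded by perturbing coefficients in the detail bands, where changes are least visible.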

Keywords: Digital watermark, Discrete wavelet transform, Copyright protection, Blind watermark, Watermark Extraction procedure

Title of the Paper: Ground Penetrating Radar Slice Reconstruction for Embedded Object in Media with Target Follow


Authors: Qeethara Kadhim Al-Shayea, Itedal S. H. Bahia

Abstract: Our goal is the detection of embedded objects in ground penetrating radar (GPR) imagery, where a GPR image is a set of cross-sectional slices and the embedded objects are of metal and/or plastic. In many fields the demand for visualizing objects scanned as cross-sectional slices is growing, with real-world applications in robotic environments, medicine, remote sensing, inspection of industrial parts, and geology. An even better way is to visualize the underground objects by reconstructing a three-dimensional model of them from the slices. Here the objects are stationary underground while the camera moves. The task of tracking an object, such as a cable or pipe, through cross-sectional slices consists of two parts: first, gathering information on changes between succeeding slices (object detection), and second, processing this information appropriately to obtain the track of the object. The proposed method starts with two-dimensional (2D) preprocessing of each slice, involving multispectral-to-gray conversion, contrast enhancement, segmentation, thresholding, and denoising. The preprocessing algorithms are chosen so as to obtain a noise-free image with the object detected or, alternatively, eliminated. After preprocessing, the proposed object detection algorithm finds the object contours in each slice, followed by 2D object transparency and transformation. The last step is the proposed interpolation technique, which builds successive slices until the gaps are filled and the embedded object is revealed.

Keywords: Object Detection, Object Recognition, Ground Penetrating Radar (GPR), Volume Reconstruction, Interpolation, Image Processing, Target Follow

Title of the Paper: A Knowledge-Based System for Knowledge Management Capability Assessment Model Evaluation


Authors: Javier Andrade, Juan Ares, Rafael Garcia, Santiago Rodriguez, Sonia Suarez

Abstract: It is now commonly accepted that high quality knowledge management programmes lead to competitive advantages for organizations. Several knowledge management maturity models have been proposed with the aim of evaluating the quality of knowledge management programmes in organizations. These models fall into two large groups: CMM-based models and models that are not CMM-related. One of the best known CMM-based models is the Knowledge Management Capability Assessment (KMCA) model. Even so, attaining a knowledge management level may imply a considerable number of audits, so it is very interesting to minimise costs by paying only for the truly indispensable ones. This article proposes a Knowledge-Based System that makes it possible to evaluate an organization at a KMCA maturity level, limiting the services of an auditor to those cases in which the system's response complies with the requested knowledge management maturity level. This clearly implies an important cost reduction for audits with negative results. The design of this system is based on the CommonKADS methodology, and its implementation was carried out with the Clips tool.

Keywords: Audit, Clips, CommonKADS, Knowledge-Based System, Knowledge Management, Maturity Model

Title of the Paper: A Sweeping Fingerprint Verification System using the Template Matching Method


Authors: Sun-Yen Tan, Wen-Tzeng Huang, Chin-Hsing Chen, Yuan-Jen Chang

Abstract: Electronic products have adopted fingerprint recognition technology to offer a protection mechanism. Line sensors are smaller, consume less power, and cost less, which makes them suitable for fingerprint sensing in embedded systems. Generally, a line sensor builds up a complete picture by assembling the image segments received during one sweep of the fingerprint. However, the sweeping rate is sometimes unstable and the pressing strength varies. To prevent such problems from influencing the final result, the Fast Normalized Cross Correlation (FastNCC) algorithm is employed to process and correct the image data. Moreover, the Group Delay Spectrum (GDS) and Dynamic Programming (DP) are implemented to perform the fingerprint comparisons. Two different fingerprint matching algorithms are exploited to demonstrate the verification rate of the system: FastNCC achieves a verification rate of 93.8%, while SSD obtains 92.5%.
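The normalized cross-correlation at the heart of FastNCC can be sketched for two equal-size patches (a plain, unaccelerated version; the "fast" precomputation tricks and the sweep-assembly logic are not shown):

```python
import math

def ncc(patch, template):
    """Normalized cross-correlation of two equal-size grey patches, in [-1, 1].
    Subtracting the means makes the score invariant to brightness offsets."""
    n = len(patch)
    mp = sum(patch) / n
    mt = sum(template) / n
    num = sum((p - mp) * (t - mt) for p, t in zip(patch, template))
    den = math.sqrt(sum((p - mp) ** 2 for p in patch) *
                    sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

template = [10, 20, 30, 40]
score_same = ncc([12, 22, 32, 42], template)   # same pattern, brightness offset
score_diff = ncc([40, 10, 35, 15], template)   # unrelated pattern
```

The brightness invariance is what lets NCC align segments even when the pressing strength, and hence the image intensity, varies between sweeps.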

Keywords: Line-Sensor, Normalized Cross Correlation, Template matching, GDS, DP matching, Threshold

Title of the Paper: Providing Flexibility in the Vehicle Route Optimization for the AVL based Transport Monitoring System


Authors: S. Masrom, Siti Z. Z. Abidin, P. N. Hashimah, A. S. Abd. Rahman

Abstract: Automated Vehicle Location (AVL) based Transport Monitoring Systems (TMS) have generated interest within the research community for decades. The emergence of enabling technologies such as the Global Positioning System (GPS) and advanced cellular communication systems has opened up new opportunities in this area of research. Vehicle location can now be determined in real time, and data can be transferred instantaneously to either centralized or decentralized processors. Implementing a full-fledged TMS requires components such as a communication module, a Geographical Information System (GIS), databases, and a monitoring application to be incorporated into the system architecture. Once all these elements are gathered, the project inevitably becomes very complex and difficult to deploy efficiently and in a timely manner. In this paper, we propose a new approach to the design and implementation of a TMS that relies on the strength and modular nature of a language-based platform. The platform not only makes the project more manageable but also enhances it with extended collaborative features, and it allows various optimization algorithms in different Vehicle Routing Problem (VRP) models to be designed and tested modularly for better system performance. With the proposed framework, an AVL based TMS can be built economically and efficiently. Analysis and comparison of current AVL based systems are also performed in order to investigate the feasibility of the project.

Keywords: Automated Vehicle Location (AVL), Collaborative environment, Multimedia communication, Vehicle Routing Problem (VRP), Route optimization, Scripting language

Title of the Paper: A Model Driven Engineering Design Approach for Developing Multi-Platform User Interfaces


Authors: Eman Saleh, Amr Kamel, Aly Fahmy

Abstract: The wide variety of interactive devices and modalities an interactive system must support has created a major challenge in designing multi-platform user interfaces and poses a number of issues for the design cycle of interactive systems. Model-Based User Interface Design (MBUID) approaches can provide useful support in addressing this problem. In MBUID the user interface is described using various models, each describing a different facet of the user interface. Our methodology is based on attributed task models from which a dialog model is derived, and from which different concrete models with different appearances can be generated. This paper presents a semi-automatic Model-Based transformational methodology for multi-platform user interface (MPUI) design. The proposed methodology puts dialog modeling at the center of the design process. A core model, our Dialog-States Model (DSM), is integrated in the design process; it represents our initial step toward adapting to multiple target platforms by assigning multiple Dialog-State models to the same task model. A multi-step reification process is then applied, from abstract models to more concrete models, until a final user interface customized to the target platform is reached.

Keywords: ConcurTaskTrees, Dialog model, Model-Based User Interface Design, StateCharts, UsiXML

Issue 6, Volume 9, June 2010

Title of the Paper: Automating Ontology Based Information Integration Using Service Orientation


Authors: Bostjan Grasic, Vili Podgorelec

Abstract: With the rise of the Internet, globalization, and the increasing number of applications used inside organizations, there is an emerging need to integrate information across heterogeneous information systems. Service oriented architecture (SOA) is seen as a general answer to intraorganisational as well as interorganisational integration problems. While service oriented systems have been well studied, some challenges remain unanswered. One of them is the automation of service execution. This paper proposes a method for the automated execution of Web Services. Based on Web Service execution automation, the proposed approach bridges the gap between ontology based integration and service oriented architecture by enabling dynamic and transparent integration of information provided by services.

Keywords: Service execution, Web services, SOA, Information integration, Semantic Web

Title of the Paper: A Framework for Eliciting Value Proposition from Stakeholders


Authors: Ghulam Murtaza, Naveed Ikram, Abdul Basit

Abstract: Eliciting the value proposition in Value Based Software Engineering (VBSE) is critical. Everything within VBSE depends on the value propositions of success critical stakeholders (SCSs). This paper presents a novel approach for eliciting value from the SCSs along different dimensions. We propose a Value Elicitation Framework (VEF) in order to resolve the problem of selecting and applying an appropriate value elicitation technique for a given situation. We applied the VEF to a small commercial project to demonstrate its execution in practice and evaluate its effectiveness. Results show that decision makers felt more confident in decision making while using the VEF, as decisions are taken on the basis of actual value rather than mere guesswork. We also found that SCSs mainly use Business, Economic and Technical Values in making decisions.

Keywords: Value Elicitation, Success Critical Stakeholders (SCS), Stakeholder Identification, Stakeholder Identification Techniques, Value Elicitation Techniques, Value Dimensions, Value Elicitation Framework

Title of the Paper: Visual Microcontroller Programming Using Extended S-System Petri Nets


Authors: Kok Mun Ng, Zainal Alam Haron

Abstract: In this paper, we present development work on a visual microcontroller programming tool based on an extended form of S-System Petri Nets (SSPN). By using the extended form of SSPNs we were able to describe in visual form subroutines, interrupts, I/O operations, arithmetic operations, and other programming constructs in a microcontroller application program. Construction of the visual programming tool included the development of a drawing editor, which uses directed graphs as the internal model for the created SSPN diagrams, and a parser to check for correct diagram and sentence syntax. The parser uses context-free diagram and string grammars for diagram and sentence syntax checking; upon successful parsing, the tool automatically translates the SSPN-represented application program into assembly code for a target microcontroller.

Keywords: S-System Petri Net (SSPN), S-System Petri Net Generator (S-PNGEN), Directed Graph Structure, Context-free Graph Grammar, Graph Transformation

Title of the Paper: Online PCA with Adaptive Subspace Method for Real-Time Hand Gesture Learning and Recognition


Authors: Minghai Yao, Xinyu Qu, Qinlong Gu, Taotao Ruan, Zhongwang Lou

Abstract: Learning methods for hand gesture recognition that compute a space of eigenvectors by Principal Component Analysis (PCA) traditionally require a batch computation step, in which the only way to update the subspace when new samples arrive is to rebuild it from scratch. In this paper, we introduce a new approach to gesture recognition based on an online PCA algorithm with an adaptive subspace, which allows for completely incremental learning. We propose to use different subspace updating strategies for a new sample according to the degree of difference between the new sample and the learned samples, which improves adaptability in different situations and also reduces calculation time and storage space. The experimental results show that the proposed method can recognize unknown hand gestures, realizing online hand gesture accumulation and updating, and improving the recognition performance of the system.
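
The batch-versus-incremental distinction the abstract draws can be illustrated with a generic rank-one subspace update (a sketch of incremental PCA in general, not the authors' exact adaptive-subspace algorithm; the function name and the threshold-free form are our own):

```python
import numpy as np

def incremental_pca_update(mean, cov, n, x):
    """Fold one new sample x into a running mean/covariance (Welford-style
    rank-one update), then re-derive the eigenbasis. A generic incremental
    PCA step, illustrative only."""
    n_new = n + 1
    delta = x - mean
    mean_new = mean + delta / n_new
    # online (population) covariance update
    cov_new = cov * (n / n_new) + np.outer(delta, x - mean_new) / n_new
    eigvals, eigvecs = np.linalg.eigh(cov_new)
    order = np.argsort(eigvals)[::-1]          # descending variance
    return mean_new, cov_new, eigvecs[:, order], eigvals[order]
```

An adaptive variant in the spirit of the paper would compare the new sample's reconstruction error in the current subspace and choose a cheaper or fuller update accordingly.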

Keywords: Online Learning, Online PCA, Adaptive Subspace, Camshift, Hand Tracking, Hand Gesture Recognition

Title of the Paper: The New Architecture of Chinese Abacus Multiplier


Authors: Chien-Hung Lin, Yun-Fu Huang, Der-Her Lee, Pao-Hua Liao, Chih-Wei Hsu

Abstract: This study demonstrates a 4x4-bit multiplier based on the Chinese abacus. Comparing the simulation results of this work with the speed and power consumption of the 4x4-bit Braun array multiplier, the abacus multiplier showed 19.7% and 10.6% delay improvements in 0.35 µm and 0.18 µm technology, respectively, over the Braun array multiplier, while its power consumption was 8.7% and 18% lower, respectively. The power-delay product of the abacus multiplier is also about 23.2% and 23.5% lower, respectively.

Keywords: Performance, Tree-based multipliers, Braun array multiplier, Function table, Chinese abacus multiplier, Delays, Thermometric

Title of the Paper: Introducing an Intelligent Computerized Tool to Detect and Predict Urban Growth Pattern


Authors: Siti Z. Z. Abidin, M. N. Fikri Jamaluddin, M. Zamani Z. Abiden

Abstract: Urban growth patterns are usually detected using spatial analysis. Spatial analysis is widely used in scientific research, especially in the fields of statistics, image processing and geoinformatics. In modeling urban growth, the analysis is mostly performed using statistical and mathematical techniques. With advanced computer technology, the physical land (ground) situation of a place of interest can be represented in digital computerized form at an accurate and appropriate scale. In this way, measurements can be made on the digitized representation for performing analysis. Change in land use is affected by many factors, such as population growth, economic change, social structure, changes in rules and regulations, and many more. These influential factors have dynamic behaviors that require complex solutions. Much research has been undertaken using methods such as geographical information systems (GIS) and cellular automata theory to model urban growth. Recently, an intelligent approach that features dynamic behavior has been introduced. An Artificial Neural Network (ANN) has the capability to learn dynamic behavior and perform prediction based on its learning process. In this paper, we present an intelligent computerized tool, called DIGMAP-Detector. This tool is able to learn a pattern of urban growth based on at least two digital maps (4-bit/pixel or 8-bit/pixel bitmaps in Bitmap File Format (BMP)). Implemented in the Java programming language, the tool reads digital map files 847 pixels in length and 474 pixels in width. Classification of the map into two independent binary classes (value 1 for urban and 0 for rural) is prepared using GIS software.
By applying cellular automata theory, in which the effect on a center pixel is determined by its surrounding pixels (eight pixels), the tool uses a back-propagation neural network that reads the values of the surrounding pixels as its input layer nodes and the center pixel as the output node. Several analyses are performed to determine appropriate values for the neural network configuration before its learning engine starts to learn the pattern of dynamic urban changes from the digital map patterns. Once the neural network engine has learnt the pattern, prediction can be carried out for missing years and for future urban growth. With good prediction accuracy, urban planning and monitoring can be performed while maintaining a good ecological and environmental system. In addition, better planning also yields economic benefits.
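
The eight-neighbour input scheme described above can be sketched as a data-preparation step (an illustrative sketch assuming binary maps at two time steps; the function name is ours, not the tool's actual code):

```python
import numpy as np

def neighborhood_samples(map_t, map_t1):
    """Build training pairs for a CA-style model: the eight pixels
    surrounding each interior cell at time t form the input vector,
    and the centre pixel at time t+1 is the target label."""
    h, w = map_t.shape
    X, y = [], []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = map_t[i - 1:i + 2, j - 1:j + 2].flatten()
            X.append(np.delete(patch, 4))   # drop the centre pixel
            y.append(map_t1[i, j])
    return np.array(X), np.array(y)
```

The resulting (X, y) pairs would then feed a back-propagation network with eight input nodes and one output node, as the abstract describes.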

Keywords: Computerized tool, Artificial Neural Network, Digital map, Urban Growth, Urban pattern, Cellular automata

Title of the Paper: Traveling Wave Solutions for the (2+1) Dimensional Boussinesq Equation and the Two-Dimensional Burgers Equation by the (G'/G)-Expansion Method


Authors: Bin Zheng

Abstract: In this paper, we demonstrate the effectiveness of the (G'/G)-expansion method by seeking further exact solutions of the (2+1) dimensional Boussinesq equation and the two-dimensional Burgers equation. With this method, the two nonlinear evolution equations are separately reduced to nonlinear ordinary differential equations (ODEs) by a simple transformation. As a result, traveling wave solutions with three arbitrary functions are obtained, including hyperbolic function solutions, trigonometric function solutions and rational solutions. When the parameters take special values, we also obtain soliton solutions of the fifth-order KdV equation. The method is easier and faster by means of a symbolic computation system.
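
For readers unfamiliar with the method, its standard ansatz (the textbook form of the (G'/G)-expansion, not this paper's specific derivation) seeks solutions of the reduced ODE as a polynomial in G'/G:

```latex
u(\xi) = \sum_{i=0}^{m} a_i \left( \frac{G'}{G} \right)^{i},
\qquad G''(\xi) + \lambda\, G'(\xi) + \mu\, G(\xi) = 0,
```

where \(\xi\) is the traveling-wave variable and the sign of \(\lambda^2 - 4\mu\) determines whether the resulting solutions are hyperbolic, trigonometric, or rational, matching the three solution families named in the abstract.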

Keywords: (G'/G)-expansion method, Traveling wave solutions, (2+1) dimensional Boussinesq equation, two-dimensional Burgers equation, exact solution, evolution equation, nonlinear equation

Title of the Paper: New Exact Traveling Wave Solutions for Three Nonlinear Evolution Equations


Authors: Bin Zheng

Abstract: In this paper, we demonstrate the effectiveness of the (G'/G)-expansion method by seeking further exact solutions of the SRLW equation, the (2+1) dimensional PKP equation and the (3+1) dimensional potential-YTSF equation. With this method, the three nonlinear evolution equations are separately reduced to nonlinear ordinary differential equations (ODEs) by a simple transformation. As a result, traveling wave solutions with three arbitrary functions are obtained, including hyperbolic function solutions, trigonometric function solutions and rational solutions. When the parameters take special values, we also obtain soliton solutions of the fifth-order KdV equation. The method is easier and faster by means of a symbolic computation system.

Keywords: (G'/G)-expansion method, Traveling wave solutions, SRLW equation, (2+1) dimensional PKP equation, (3+1) dimensional potential-YTSF equation, exact solution, evolution equation, nonlinear equation

Title of the Paper: The Evaluation of Nearly Singular Integrals in the Direct Regularized Boundary Element Method


Authors: Yaoming Zhang, Yan Gu, Bin Zheng

Abstract: The numerical analysis of the boundary layer effect is one of the major problems of concern in the boundary element method (BEM). Its accuracy depends on the precision of the evaluation of the nearly singular integrals. In boundary element analysis with the direct formulation, hyper-singular integrals arise from the potential derivative boundary integral equations (BIEs). Thus nearly strongly singular and nearly hyper-singular integrals need to be calculated when the interior points are very close to the boundary. Nearly hyper-singular integrals are generally considered more difficult to evaluate. In this paper, a general nonlinear transformation is adopted and applied to calculating the potential and its derivative at interior points very close to the boundary. Numerical examples demonstrate that the present algorithm is efficient and can overcome the boundary layer effect successfully even when the interior points are very close to the boundary.

Keywords: BEM, potential problems, nearly singular integrals, boundary layer effect, transformation, Numerical method

Title of the Paper: Social Awareness: The Power of Digital Elements in Collaborative Environment


Authors: Zainura Idrus, Siti Z. Z. Abidin, R. Hashim, N. Omar

Abstract: Awareness is the sense of what is happening, who is around, what they are doing, what their emotional states are, and whether or not they notice you. Social awareness, and how it is promoted by digital elements in networked collaborative virtual environments (NCVE), is the main focus of this study. Social awareness is defined as the understanding of a contextual situation at the present time. In NCVEs, awareness plays an important role in achieving effective digital communication. For a particular virtual environment, participants should be aware of the people with whom they are interacting, their responsibilities and contributions, the collaborative activities, and their progress level. This paper discusses the digital elements that are used to support awareness during virtual collaboration by exploring their characteristics and their differences in terms of their specific roles in promoting awareness. With the main focus on social awareness, eight different awareness types (presence, turn taking, emotion, identities, state, role, contextual and conversational) are presented with respect to these digital elements. The impacts of using each digital element in various applications are also identified in order to enhance the usage of these elements when they are applied to relevant networked collaborative applications. Thus, with the appropriate use of digital elements, awareness in such situations can be improved.

Keywords: Networked Collaborative Virtual Environment (NCVE), Awareness, Digital elements, Social awareness, Communication, Interactive collaborative applications.

Title of the Paper: A New Accurate Technique for Iris Boundary Detection


Authors: Mahboubeh Shamsi, Puteh Bt Saad, Subariah Bt Ibrahim, Abdolreza Rasouli, Nazeema Bt Abdulrahim

Abstract: Iris based identification systems have recently attracted considerable attention. In this process, the iris must be segmented from the captured eye image. The low contrast between the pupil and the iris usually makes the segmentation process harder and decreases the accuracy of detecting the boundary between them. In order to segment the iris more precisely, we propose a new technique using a difference function and a factor matrix. We also modify the segmentation operator to detect the pupil and iris as ellipses instead of circles. Experiments show that the proposed technique can segment the iris region and pupil region precisely. Based on our results, 99.34% of eyes were segmented accurately, in 1.24 s on average.

Keywords: Daugman’s method, Average Square Shrinking, Difference Function, Contour Factor Matrix

Issue 7, Volume 9, July 2010

Title of the Paper: A Novel Text Modeling Approach for Structural Comparison and Alignment of Biomolecules


Authors: Jafar Razmara, Safaai B. Deris

Abstract: In this paper, a novel strategy for the structural alignment of proteins based on text modeling techniques is introduced. The method summarizes the protein secondary and tertiary structure in two textual sequences. The first sequence is used for the initial superposition of secondary structure elements, and the second is employed to align the 3D structures of the two compared proteins. The comparison technique used by the method is inspired by computational linguistics, where it is used for analysing and quantifying textual sequences. In this strategy, the cross-entropy measure over n-gram models is used to capture regularities between sequences of protein structures. The performance of the method is evaluated and compared with the CE and SSM methods. The results of the experiments reported here provide evidence for the preference and applicability of the new approach in terms of efficiency and effectiveness.
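
The cross-entropy-over-n-grams measure the abstract borrows from computational linguistics can be sketched generically as follows (add-one smoothing and the function name are our assumptions, not the paper's exact formulation):

```python
import math
from collections import Counter

def ngram_cross_entropy(train_seq, test_seq, n=2):
    """Cross-entropy (bits per symbol) of test_seq under an n-gram model
    estimated from train_seq with add-one smoothing. Lower values mean
    test_seq is more 'regular' with respect to train_seq."""
    counts = Counter(tuple(train_seq[i:i + n])
                     for i in range(len(train_seq) - n + 1))
    ctx = Counter(tuple(train_seq[i:i + n - 1])
                  for i in range(len(train_seq) - n + 2))
    vocab_size = len(set(train_seq) | set(test_seq))
    total, m = 0.0, len(test_seq) - n + 1
    for i in range(m):
        gram = tuple(test_seq[i:i + n])
        p = (counts[gram] + 1) / (ctx[gram[:-1]] + vocab_size)
        total -= math.log2(p)
    return total / m
```

Applied to the paper's two structural sequences, a sequence similar to the training structure scores a lower cross-entropy than an unrelated one.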

Keywords: Protein structure alignment, n-gram modeling, cross-entropy

Title of the Paper: The Theory and Application of an Adaptive Moving Least Squares for Non-uniform Samples


Authors: Xianping Huang, Qing Tian, Jianfei Mao, Li Jiang, Ronghua Liang

Abstract: Moving least squares (MLS) has wide applications in scattered point approximation, fitting and interpolation. In this paper, we propose an improved MLS approach, adaptive MLS, for fitting non-uniform sample points. The radius of the MLS support can be adaptively adjusted according to the consistency of the sampled data points. Experiments demonstrate that our method produces higher quality approximation and fitting results than standard MLS.
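
The adaptive-radius idea can be illustrated with a minimal 1-D MLS sketch, where the Gaussian support width at each evaluation point is set to the distance of its k-th nearest sample (a sketch of the general idea under our own assumptions, not the authors' exact scheme):

```python
import numpy as np

def adaptive_mls_fit(xs, ys, x_eval, k=4):
    """1-D moving least squares with a linear basis and a locally
    adapted Gaussian radius (distance to the k-th nearest sample)."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    out = []
    for x in np.atleast_1d(x_eval):
        d = np.abs(xs - x)
        h = np.sort(d)[min(k, len(xs) - 1)]          # adaptive radius
        sw = np.sqrt(np.exp(-(d / h) ** 2))          # sqrt of MLS weights
        A = np.vstack([np.ones_like(xs), xs - x]).T  # linear basis at x
        coef, *_ = np.linalg.lstsq(A * sw[:, None], ys * sw, rcond=None)
        out.append(coef[0])                          # fitted value at x
    return np.array(out)
```

Because the basis is linear, the fit reproduces linear data exactly; the adaptivity only matters where the sampling density varies.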

Keywords: MLS, Sample Points, Non-uniform, Points Set, Approximation Fitting, Interpolation

Title of the Paper: Data Simulation of Matern Type


Authors: Jia-Yue Li, Xin Lu, Ming Li, Shengyong Chen

Abstract: Recently, the Matern correlation function has received increasing interest in geostatistics. This paper discusses our work on synthesizing random data based on the Matern correlation function. The analysis in this paper shows that data of the Matern type are in the domain of fractal time series. The present results suggest that the power spectral density (PSD) based method may be efficient for synthesizing random data of the Matern type. We explain the reasons for selecting the PSD based method and give demonstrations of the simulations. This paper may also pave the way toward the generation of multidimensional random fields of the Matern type.
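
The PSD-based synthesis route the abstract advocates amounts to shaping white noise in the frequency domain; a generic sketch (not tied to a specific Matern parameterization, function name ours):

```python
import numpy as np

def synthesize_from_psd(psd, rng=None):
    """Generate a real Gaussian sequence whose power spectrum follows
    `psd` (length-N array of non-negative values): take white Gaussian
    noise, scale its spectrum by sqrt(PSD), and transform back."""
    rng = rng if rng is not None else np.random.default_rng()
    noise = rng.normal(size=len(psd))
    spectrum = np.fft.fft(noise) * np.sqrt(np.maximum(psd, 0.0))
    return np.fft.ifft(spectrum).real
```

For Matern-type data one would supply the PSD corresponding to the Matern correlation function; with a flat PSD the filter is the identity and white noise is returned unchanged.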

Keywords: Random data generation, Geostatistics, Fractal time series, Fractional Langevin equation, Fractional oscillator processes, The Matern correlation function

Title of the Paper: FGN Based Telecommunication Traffic Models


Authors: Ming Li, Wei Zhao, Shengyong Chen

Abstract: This paper addresses three models of traffic based on fractional Gaussian noise (fGn). The first is the standard fGn (fGn for short), which is characterized by a single Hurst parameter. The second is the generalized fGn (GfGn), indexed by two parameters. The third is the local Hurst function. The limitation of fGn in traffic modeling is explained. We show that the GfGn model can be used to relax that limitation. Finally, we discuss the local Hurst function to show that it is a simple model expressing the multifractal property of traffic on a point-by-point basis.
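
The single-parameter character of standard fGn that the abstract mentions is captured entirely by its textbook autocovariance, which can be written down directly (a standard formula, not code from the paper):

```python
import numpy as np

def fgn_autocovariance(lags, H, sigma2=1.0):
    """Autocovariance of standard fGn with Hurst parameter H:
    gamma(k) = (sigma2/2) * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})."""
    k = np.abs(np.asarray(lags, float))
    return 0.5 * sigma2 * (np.abs(k + 1) ** (2 * H)
                           - 2 * k ** (2 * H)
                           + np.abs(k - 1) ** (2 * H))
```

At H = 0.5 the lags decorrelate (white noise); for H > 0.5 the positive, slowly decaying correlations give the long-range dependence whose inflexibility motivates the two-parameter GfGn model.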

Keywords: Internet traffic modeling, Fractional Gaussian noise, Fractal time series, Statistical computing

Title of the Paper: Web Client Server Systems with Advanced Asynchronous Communication, their Architecture and Applications


Authors: Filip Maly, Antonin Slaby

Abstract: This article extends the ideas and results presented in our contribution to the WSEAS conference in Hangzhou. In the paper we describe one solution to the important and difficult problem of finding suitable means of communication in special client-server systems with a hybrid client that is restricted from a functionality point of view. We also describe basic improvements to the client that enable it to run more sophisticated software processes. In addition, we present the general concept and architecture of web client-server systems that use asynchronous communication through one always-open port (port No. 80). These systems have wide use and many applications; one of them is described in a case study.

Keywords: Client server systems, asynchronous communication, hybrid client, software processes, learning management systems

Title of the Paper: Developing Question Answering (QA) Systems using the Patterns


Authors: Maria Moise, Ciprian Gheorghe, Marilena Zingale

Abstract: This paper describes a way to use Natural Language Processing (NLP) techniques in order to gain faster access to information from a closed domain. Traditional graphical user interfaces built as tree structures, such as menus, force users to browse irrelevant information. To avoid this, we use one of the major tasks in NLP, Question Answering (QA), and build a demonstration application with Visual FoxPro 9.0. We impose restrictions on the domain of use, that of tourist attractions in Romania, and on the natural language input method. We use tags to represent words or groups of words, and patterns to symbolize questions. We use this representation in order to provide the user with a non-traditional human-computer interaction. The QA system relies on a database where we store specific information on the chosen domain and related data in order to accomplish the task of providing an answer. The answering mechanism uses a knowledge-engineering approach with syntactic and semantic analysis. At the end of this paper we discuss the evaluation of the QA system.
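
The tag-and-pattern idea can be sketched with a minimal matcher (the pattern syntax, tag set, and answer keys here are hypothetical illustrations, not the paper's actual patterns):

```python
import re

def match_question(question, patterns):
    """Match a user question against a list of (regex_pattern, answer_key)
    pairs, returning the answer key of the first full match, or None.
    The answer key would then drive a database lookup."""
    for pattern, answer_key in patterns:
        if re.fullmatch(pattern, question.strip().lower()):
            return answer_key
    return None
```

A QA system of this kind would map the returned key, together with the captured entity, to a database query for the stored domain information.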

Keywords: Tag, Pattern, Natural Language Processing, Question Answering

Title of the Paper: The Malaysia IT Outsourcing Industry: Skill-Sets Requirements of Future IT Graduates


Authors: Abdul Rahman Ahlan, Yusri Arshad, Mohd Adam Suhaimi, Husnayati Hussin

Abstract: IT changes very quickly and influences business, industry and the public enormously. The impact of IT outsourcing (ITO) on the IT workforce has been discussed widely in many developed countries, so much so that concerns have intensified that more IT jobs will be transferred to developing countries which provide IT outsourcing services. In addition, the growth of IT outsourcing and the emergence of new outsourcing business models, such as utility demand, application service provision, business process outsourcing, offshore outsourcing and many more serving different industry vertical sectors, increase the demand for multiple IT skills and capabilities in the IT workforce. In this study, we review relevant literature, newspapers and non-academic articles, and the websites of companies, governments, non-governmental organisations and others. An Outsourcing Malaysia (OM) CEO roundtable discussion was also held to gather input from practitioners on the topic. In addition, we sought in-depth insights from seven senior executives in service provider firms on the skill-set requirements of fresh IT graduates to fulfill the market needs of IT outsourcing in Malaysia. The four-member research team found that technical, soft and problem-solving skills are the main concerns raised by the key informants. This is in line with the literature review and also with the present higher education policy concerns of the Malaysian government.

Keywords: Malaysia, IT outsourcing, skill-sets, workforce, curriculum, developing country

Title of the Paper: A Comparative Analysis of the Current Status of Digital Divide in Taiwan


Authors: Ruey-Gwo Chung, Chih-Wei Li, Chen-Liao Chen

Abstract: Information technology and network communications play a very important role in today's modern society, and they have a significant influence on national policy-making. In education, their importance is reflected by sought-after IT courses and information-integrated teaching. The “digital divide” is an issue caused by a delay in the introduction of information technology to different ethnic and social groups, lower availability of information equipment, and differing abilities to access the Internet. In order to understand the accessibility of, and ability to use, information equipment in different regions of Taiwan, the Research, Development, and Evaluation Commission of the Executive Yuan (RDEC) has conducted surveys on the domestic digital divide since 2001, using the findings as references for formulating policies that address the issue. The purpose of this paper is to use secondary data to analyze the latest “2009 Digital Divide Survey” by the RDEC and to describe the current status of the digital divide in Taiwan. The research indicates that, at the moment, the digital divide in Taiwan is mainly caused by a lack of willingness and ability to learn among individuals who have a lower educational background and/or are older. This is significantly different from the causes of the digital divide found in 2002, which included expensive computer equipment and Internet access fees, poor connectivity and bandwidth, and a lack of access in public places. In aiming to reduce the digital divide, the government is advised to adjust its strategies accordingly to meet the changes in society.

Keywords: Digital divide, information technology, technical network, disadvantaged groups, information literacy, information education

Title of the Paper: Craniofacial Reconstruction Based on MLS Deformation


Authors: Li Jiang, Xiangyin Ma, Yaolei Lin, Lewei Yu, Qianwei Ye

Abstract: Craniofacial reconstruction aims at estimating the facial outlook associated with an unknown specimen. It is generally based on the statistical tissue thickness at anthropometric landmarks. However, feature points alone are not enough for realistic reconstruction. Therefore, in our paper, we take advantage of a reference facial model: by measuring the differences between the target facial feature points and the reference facial feature points, a novel craniofacial reconstruction algorithm based on moving least squares (MLS) deformation is presented. The 3D skull mesh model is obtained with the Marching Cubes algorithm, which extracts iso-surfaces from a complete set of head CT slices. The holes detected in the 3D skull model can be repaired with different methods after they are classified. Then, the craniofacial surface is reconstructed using MLS deformation with the constraint of the reference facial model. The experimental results show that the method produces more desirable results than others.

Keywords: Craniofacial reconstruction, Deformation, MLS, Feature points, Hole repairing

Title of the Paper: Using Multiple Imputation to Simulate Time Series: A Proposal to Solve the Distance Effect


Authors: Sebastian Cano, Jordi Andreu

Abstract: Multiple Imputation (MI) is a Markov chain Monte Carlo technique developed to work out missing data problems, especially in cross-sectional approaches. This paper uses Multiple Imputation from a different point of view: it applies the technique to time series and thereby develops a simpler framework presented in previous papers. Here, the authors' idea consists basically of an endogenous construction of the database (the use of lags as supporting variables supposes a new approach to dealing with the distance effect). This construction strategy avoids noise in the simulations and forces the limit distribution of the chain to converge well. Using this approximation, estimated plausible values are closer to real values, and missing data can be resolved with more accuracy. This new proposal solves the main problem detected by the authors in [1] when using MI with time series: the previously mentioned distance effect. An endogenous construction when analyzing time series avoids this undesired effect and allows Multiple Imputation to benefit from information from the whole database. Finally, new R computer code was designed to carry out all the simulations and is presented in the Appendix to be analyzed and updated by researchers.
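
The lags-as-supporting-variables idea can be illustrated with a toy imputation step (a deliberately simplified sketch: a single lag-1 regression with residual noise drawn m times, not the full MCMC scheme of the paper; function name and defaults are ours):

```python
import numpy as np

def impute_with_lags(series, m=5, rng=None):
    """Toy multiple imputation for a time series: estimate x_t ~ x_{t-1}
    from complete pairs, then draw each missing value m times with
    residual noise, yielding m completed series."""
    rng = rng if rng is not None else np.random.default_rng()
    s = np.asarray(series, float)
    prev, curr = s[:-1], s[1:]
    ok = ~np.isnan(prev) & ~np.isnan(curr)
    b, a = np.polyfit(prev[ok], curr[ok], 1)            # slope, intercept
    resid_sd = np.std(curr[ok] - (a + b * prev[ok]))
    draws = []
    for _ in range(m):
        x = s.copy()
        for t in range(1, len(x)):
            if np.isnan(x[t]):
                x[t] = a + b * x[t - 1] + rng.normal(0, resid_sd)
        draws.append(x)
    return np.array(draws)                              # shape (m, len)
```

Because the covariate is the immediately preceding observation, imputations draw on local information rather than on distant rows, which is the intuition behind avoiding the distance effect.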

Keywords: Missing data, Multiple Imputation, Time Series, MCMC, Simulation Algorithms, R Programming

Title of the Paper: A Real-Time Data Acquisition System for the Laguna Verde Nuclear Power Plant


Authors: Ilse Leal Aulenbacher, Jose Maria Suarez Jurado, Efren R. Coronel Flores

Abstract: This paper focuses on a real-time data acquisition system developed for the Laguna Verde Nuclear Power Plant in Veracruz, Mexico. Because the data acquisition modules needed to be replaced, the need for a New Acquisition System arose. Replacing the former data acquisition system poses several technical challenges, including the fact that the new system must work together with the former one. Additionally, it must remain online during the whole process, since the Data Acquisition System is one of the nuclear power plant's most important monitoring systems and is required for its operation. This paper focuses on the key software design aspects of the New Acquisition System, which were used to address the main system requirements: integration with the former data system, multiple data source support, high-frequency data storage and high-speed data access. The New Acquisition System is composed of a large number of modules. This paper specifically focuses on the Acquisition Subsystems, the Central Acquisition Process and the Master Historical Archive Process.

Keywords: Data acquisition system, Real-time, Linux, Software, Design, Archive, Acquisition Subsystem

Title of the Paper: The Access of Persons with Visual Disabilities to Scientific Content


Authors: Narcisa Isaila, Ion Smeureanu

Abstract: Access to scientific content for people with disabilities involves a number of problems, both in reading web documents, which contain elements such as images with or without alternative text, and in reading from files, especially if these contain mathematical expressions. From this point of view, the utility of applications that synthesize artificial speech from text written in Romanian is undeniable, given the increased concern for creating facilities designed to contribute to a better integration of persons with disabilities into society. This paper is the result of research into education on access to science for people with disabilities; it integrates facilities for the audio presentation of information in Romanian and provides a way to integrate assistive components into an open source learning platform.

Keywords: Accessibility, Vocal synthesizers, Assistive technologies, Web service, Mathematics, W3C

Issue 8, Volume 9, August 2010

Title of the Paper: Program-Operators to Improve Test Data Generation Search


Authors: Mohammad Alshraideh, Mohammad Qatawneh, Wesam Al Mobaiden, Azzam Sleit

Abstract: There has recently been a great deal of interest in search based test data generation, with many local and global search algorithms being proposed. In this paper, program-specific search operators are used to increase performance in the generation of test data. The efficacy and performance of the proposed testing approach are assessed and validated using a variety of sample programs, and the empirical investigation is shown to give a more than eightfold increase in performance.
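
The cost-function machinery underlying search based test data generation can be sketched generically (classic branch distance for an equality predicate, plus a random-restart local search as a minimal stand-in for the genetic search; target value and step sizes are illustrative):

```python
import random

def branch_distance(a, b):
    """Classic branch-distance cost for the predicate `a == b`:
    zero when the branch would be taken, otherwise how far the
    inputs are from taking it."""
    return abs(a - b)

def search_test_input(target=4711, budget=20000, rng=None):
    """Greedy local search minimising the branch distance: propose a
    mutated input each step and keep it only if the cost decreases."""
    rng = rng if rng is not None else random.Random(0)
    best = rng.randint(-10000, 10000)
    for _ in range(budget):
        cand = best + rng.choice([-100, -10, -1, 1, 10, 100])
        if branch_distance(cand, target) < branch_distance(best, target):
            best = cand
    return best
```

Program-specific operators, as proposed in the paper, would replace the generic mutation steps above with moves tailored to the program under test.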

Keywords: Genetic algorithms, program-Search operators, cost function, branch-coverage, software testing, dynamic testing, search-based

Title of the Paper: Alternative A(H1N1) Suspects Management


Authors: Dan Adrian Marior, Radu Zglimbea, Constantin Carciumaru

Abstract: This paper deals with the differential diagnostics process behind the verdict of A(H1N1) infection for swine flu suspects. The application, designed as a support tool for doctors, was built with the ASP.NET 3.5 technology, a broad-spectrum development platform, which enabled us to create a web application that is accessible from anywhere on the internet, assists the doctor in the diagnostics process, filters out false positives, manages the patients, and also generates sets of statistical charts for case evolution analysis.

Keywords: A(H1N1), application, diagnostic, management, charts

Title of the Paper: Web Page Analysis Based on HTML DOM and Its Usage for Forum Statistics, Alerts and Geo Targeted Data Retrieval


Authors: Robert Gyorodi, Cornelia Gyorodi, George Pecherle, George Mihai Cornea

Abstract: Message boards are part of the Internet known as the 'Invisible Web' and pose many problems to traditional search engine spiders. The dynamic content is usually very deep and difficult to search. In addition, many of these sites change their locations, servers, or URLs almost daily, creating problems for the indexing process. However, with the growth of the World Wide Web and the help of search engines, they have become an important source of information for solving different problems. Another interesting feature of this type of web page is that a big community has developed around it, expressing different opinions and discussing various topics. Using special retrieval and indexing algorithms, mostly based on the HTML DOM tree, we have developed an algorithm to obtain detailed and accurate trend statistics that can be used for different marketing solutions and analysis tools. Combined with the services provided by traffic ranking sites, we can also provide geo-targeting functionality to deliver even more accurate results to the end user, such as what percentage of the users who visit a certain forum come from a certain country.

Keywords: Data analysis, data models, HTML DOM, information extraction, text recognition, geo targeting

Title of the Paper: Using Text Mining Techniques in Electronic Data Interchange Environment


Authors: Zakaria Suliman Zubi

Abstract: The internet is a huge source of documents, containing a massive number of texts in many languages on a wide range of topics. These texts are presented as electronic documents hosted on the web. The documents are exchanged using special forms in an Electronic Data Interchange (EDI) environment. Using web text mining approaches to mine documents in an EDI environment could open new and challenging directions in web text mining. Applying text mining approaches to discover previously unknown patterns in web documents, using partitional cluster analysis methods such as k-means with the Euclidean distance measure on EDI text document datasets, is a unique area of research these days. Our experiments apply the standard k-means algorithm to an EDI text document dataset of the kind most commonly used in electronic interchange, and we report some results obtained using the text mining tool WEKA. This study will provide high quality services to any organization that is willing to use the system.

Keywords: Electronic Data Interchange (EDI), Web Mining, Text Mining, Clustering, K-means algorithm, Similarity Measures, Partitional Cluster Analysis
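
The k-means procedure with the Euclidean distance measure named in the abstract can be sketched as follows. This is an illustrative, pure-Python sketch applied to toy term-frequency vectors, not the authors' EDI/WEKA implementation; the helper names and the toy data are this editor's assumptions.

```python
import math
import random

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=50, seed=0):
    """Standard k-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: euclidean(p, centroids[c]))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster went empty.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return clusters, centroids
```

On a document collection, each point would be a term-frequency vector of one document; WEKA's SimpleKMeans follows the same assign-and-update scheme.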

Title of the Paper: Concept Structure based on Response Pattern Detection of S-P Chart with Application in Algebra Learning


Authors: Jeng-Ming Yih, Yuan-Horng Lin

Abstract: The main purpose of this study is to provide an integrated method for personal concept structure analysis. Based on the use of the S-P chart (student-problem chart) to classify learning styles, students of different learning styles display their own characteristic concept structures. In this study, the S-P chart is used to classify students' learning styles, and concept structure analysis (CSA) is used to display personalized concept structures. The CSA algorithm is the major methodology; its foundations are the fuzzy logic model of perception (FLMP) and interpretive structural modeling (ISM). CSA can clearly represent hierarchies and linkages among concepts and can therefore effectively display the features of personal concept structures. Empirical data on linear algebra concepts collected from university students are discussed. The results show that students of different learning styles have distinct knowledge structures, and that CSA combined with the S-P chart is feasible for cognitive diagnosis. Based on the findings and results, some suggestions and recommendations for future research are discussed.

Keywords: Concept structure, cognitive diagnosis, FLMP, ISM, S-P chart

Title of the Paper: Integrated Thalassaemia Decision Support System


Authors: Rached Omer Agwil, Divya Prakash Shrivastava

Abstract: Thalassaemia is a genetic blood disorder in which the blood cells are unable to carry a sufficient oxygen supply to the organs. Thalassaemia has a distribution concomitant with areas where P. falciparum malaria is common: alpha Thalassaemia is concentrated in Southeast Asia, Malaysia, and southern China, while beta Thalassaemia is seen primarily in the areas surrounding the Mediterranean Sea, in Africa, and in Southeast Asia. Due to global migration patterns, there has been an increase in the incidence of Thalassaemia in North America in the last ten years, primarily due to immigration from Southeast Asia (The Reader’s Digest, 1989). The increase in cases forces government hospitals to seek alternative methods of diagnosing Thalassaemia rather than depending solely on the haematologist. Case-Based Reasoning is an Artificial Intelligence technique that utilizes the specific knowledge of previous experiences to solve new problems by recalling similar past cases, which makes it highly suitable for the medical domain. Hence, this work provides a model that demonstrates how Thalassaemia can be diagnosed via the Integrated Thalassaemia Decision Support System (ITDSS).

Keywords: DSS, CBR, ITDSS, Thalassaemia, Medical Application, Disease Diagnosis, AI

Title of the Paper: The Use of Domain Ontologies for the Virtual Scenes Management


Authors: Crenguta Bogdan, Dorin Mircea Popovici

Abstract: In this paper, an object-oriented software system for the management of virtual scenes is presented. This system, called OntSceneBuilder, uses a domain ontology in order to obtain at least three benefits: accuracy, ease of content reuse, and ease of management of virtual scenes. The accuracy is ensured by default, since any ontology provides a precise specification of the concepts of a domain and their relations. Each concept is associated with 2D and 3D resources and a virtual artifact. The graph of the virtual artifacts forms a virtual scene of a virtual exposition. The paper also presents some models of the system development process, mainly realized during the analysis and design activities. Our aim was to analyze the OntSceneBuilder from the functional and interactional viewpoints and to create its software use case diagram. Furthermore, each software use case was designed from the structural and dynamical viewpoints. At the same time, we also constructed the system's software architecture. Some classes of the software architecture manage the concepts of the domain ontology associated with the topic chosen by the user. The system has been tested in the creation of virtual scenes for a virtual historical exposition.

Keywords: Virtual scene, domain ontology, concept, virtual artifact, software analysis, software architecture, virtual historic exposition

Title of the Paper: A Portable Biometric Access Device Using a Dedicated Fingerprint Processor


Authors: Hatim A. Aboalsamh

Abstract: Biometric signatures, or biometrics, are used to identify individuals by measuring certain unique physical and behavioral characteristics. Individuals must be identified to allow or prohibit access to secure areas, or to enable them to use personal digital devices such as a computer, personal digital assistant (PDA), or mobile phone. Virtually all biometric methods are implemented using the following components: 1) a sensor, to acquire raw biometric data from an individual; 2) feature extraction, to process the acquired data into a feature set that represents the biometric trait; 3) pattern matching, to compare the extracted feature set against stored templates residing in a database; and 4) decision-making, whereby a user’s claimed identity is authenticated or rejected. A typical access control system uses two components. The first is a fingerprint reader that is connected to a database to match pre-stored fingerprints against the one obtained by the reader. The second is an RFID card that transmits information about the person requesting access. In this paper, a compact system is presented that combines a CMOS fingerprint sensor (FPC1011F1) with the power-efficient FPC2020 fingerprint processor, which acts as a biometric sub-system with a direct interface to the sensor as well as to an external flash memory for storing fingerprint templates. The small size and low power consumption enable this integrated device to fit in smaller portable and battery-powered devices while maintaining high identification speed. An RFID circuit is integrated with the sensor and fingerprint processor to create an electronic identification card (e-ID card). The e-ID card pre-stores the fingerprint of the authorized user; when the card is used and fingerprint authentication succeeds, the RFID circuit is enabled to transmit data and allow access to the user.

Keywords: Access control, RFID, Fingerprint processor, Fingerprint authentication, Biometrics

Title of the Paper: Software Implementation of Hydraulic Shock Numerical Computation in the Pressure Hydraulic Systems without Protection Devices


Authors: Ichinur Omer

Abstract: This paper presents software for the computation of the hydraulic shock phenomenon in pressure hydraulic systems without protection devices. The program is written in the Java programming language and responds to the following requirements: easy management of several projects; easy entry, editing, and modification of input data; and proper display of the program output, namely the hydraulic load and speed at every moment of time. To numerically solve the equations of the hydraulic shock phenomenon, we apply the method of characteristics. The output data are the values of the speeds and hydraulic loads in sections along the pipeline, from which system behavior in a given situation can be assessed.

Keywords: Hydraulic shock (water hammer), pressure hydraulic system, method of characteristics, software, flowchart, graphical interface
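
For reference, the method of characteristics reduces the water-hammer partial differential equations to two compatibility relations along the characteristic lines. The textbook form below (after Wylie and Streeter) is a standard sketch, not necessarily the exact formulation implemented in the paper's Java program. At an interior node $P$, with upstream neighbour $A$ on the $C^+$ line and downstream neighbour $B$ on the $C^-$ line:

```latex
% C+ :  H_P = C_p - B Q_P        C- :  H_P = C_m + B Q_P
\begin{align*}
  C_p &= H_A + B\,Q_A - R\,Q_A\lvert Q_A\rvert, &
  C_m &= H_B - B\,Q_B + R\,Q_B\lvert Q_B\rvert,\\
  B   &= \frac{a}{g\,A_{\mathrm{pipe}}}, &
  R   &= \frac{f\,\Delta x}{2\,g\,D\,A_{\mathrm{pipe}}^{2}},
\end{align*}
```

where $a$ is the wave speed, $A_{\mathrm{pipe}}$ the pipe cross-section, $D$ the diameter, and $f$ the friction factor. Adding the two relations gives $H_P = (C_p + C_m)/2$ directly, after which $Q_P$ follows from either line; this pair of updates, marched along the pipeline at each time step, is what the program's output of heads and speeds per section corresponds to.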

Title of the Paper: Influence of Noise on the Results of Rigid Registration of Segmented Ovarian Volumes Using Spherical Correlation in Frequency Domain


Authors: Boris Cigale, Damjan Zazula

Abstract: The influence of noise on the results of rigid registration of segmented ultrasound volumes is studied in this paper. Binary volumes result from a segmentation of ovarian ultrasound volumes. Rigid registration is performed in the frequency domain, where the rotation and translation can be calculated separately. The rotation is calculated using the amplitude spectrum and spherical correlation. The method was tested on pairs of synthetic volumes in which the ovarian follicles in one volume were altered to simulate the different kinds of noise (non-rigid changes) characteristic of segmented volumes. We systematically assessed the performance of our registration algorithm by changing the number of follicles and their position, orientation, and size. One hundred volume pairs were involved in each experiment. The method proved sensitive to changes in follicle size but resistant to all the other kinds of degradation we simulated.

Keywords: Image registration, rigid registration in frequency domain, spherical correlation

Title of the Paper: Textual Data Compression Speedup by Parallelization


Authors: Goran Martinovic, Caslav Livada, Drago Zagar

Abstract: When the omnipresent challenge of space saving reaches its full potential, so that a file cannot be compressed any more, a new question arises: “How can we improve our compression even more?”. The answer is obvious: “Let's speed it up!”. This article tries to find the meeting point of space saving and compression time reduction. The reduction is based on breaking a task into smaller subtasks that are compressed simultaneously and then joined together. Five different compression algorithms are used, two of which are entropy coders and three of which are dictionary coders. An individual analysis of every compression algorithm is given, and in the end the compression algorithms are compared by performance and speed depending on the number of cores used. To summarize the work, a speedup diagram is given to see whether Amdahl and Gustafson were right.

Keywords: Data compression, lossless coding, entropy coder, dictionary coder, parallel computing, compression time speedup
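
The chunk-and-compress-in-parallel idea described above can be sketched in a few lines. This hypothetical Python sketch uses zlib (a DEFLATE-family dictionary coder) with a thread pool — zlib releases the GIL during compression, so threads give real parallelism — and is not the authors' benchmark code; chunk sizing and worker count are illustrative choices.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_chunk(chunk):
    # Each subtask is compressed independently (level 9 = best ratio).
    return zlib.compress(chunk, 9)

def parallel_compress(data, n_workers=4):
    # Break the task into subtasks of roughly equal size.
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Compress the subtasks simultaneously on the worker pool.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(compress_chunk, chunks))

def parallel_decompress(blocks):
    # Join the independently compressed blocks back together.
    return b"".join(zlib.decompress(b) for b in blocks)
```

Note the trade-off the article measures: compressing chunks independently loses cross-chunk redundancy (slightly worse ratio) in exchange for near-linear speedup, which is exactly where Amdahl's and Gustafson's laws enter the picture.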

Issue 9, Volume 9, September 2010

Title of the Paper: Computer Modeling and Simulation of the Nanoaggregation and Solubility of Crude Oil Asphaltenes


Authors: Francesco Frigerio

Abstract: The methodology of classical molecular dynamics provides useful tools for the simulation of the solution behaviour of asphaltenes. The aggregation and the solubility properties of this class of molecules are studied at a full atomistic level. Average three-dimensional asphaltene models are built on the basis of experimental data, collected from a series of crude oil samples. The simulation of two such asphaltene models in four different solvents puts into evidence the formation of oligomeric clusters. Their analysis gives clues to the initial stages of asphaltene aggregation at the nanometer scale. The Hildebrand solubility parameter is calculated for the whole collection of asphaltene models. This is a practical example of obtaining useful physicochemical properties from molecular simulations applied to average asphaltene structures.

Keywords: Simulation, Model building, Molecular dynamics, Asphaltene, Oil, Aggregation, Solubility, Hildebrand

Title of the Paper: A Novel Object Detection Approach Based on the Boundary Shape Information from High Resolution Satellite Imagery


Authors: Xiaoshu Si, Xuemin Hu, Hong Zheng

Abstract: This paper presents a novel approach for detecting special objects in high resolution satellite imagery. In this approach, bilateral filtering is used to reduce noise, and a new hybrid morphological approach is proposed for ROI extraction and feature enhancement. A detection operator based on boundary shape information (BSI) is developed to detect the preprocessed objects. Images from Google Earth are used for testing, and comparisons with popular object detection approaches are also discussed. The experimental results show that the proposed approach is effective and feasible.

Keywords: BSI, Detection template, Object detection, High resolution satellite imagery, Vehicles, Aircrafts

Title of the Paper: Data Wiping System with Fully Automated, Hidden and Remote Destruction Capabilities


Authors: George Pecherle, Cornelia Gyorodi, Robert Gyorodi, Bogdan Andronic

Abstract: In this article, we describe a method to securely erase sensitive data in a fully automated and hidden mode, with remote data destruction capabilities. Compared to other similar technologies, our method has two main advantages. The first is the ability to run in a fully automated mode; in other words, the system is configured once and the computer is protected without requiring any user intervention. The second is the ability to run in a so-called hidden mode, in which the system appears to be a different piece of software, for the main purpose of confusing other users. Our system can also prevent data loss from stolen laptops by triggering remote wiping of sensitive data. This is done by overwriting the encryption key of an encrypted volume, which makes the data completely unrecoverable. Tests and results showing that the data is not recoverable are presented at the end of the paper, and we describe the structure and functionality of our system, along with some of the most important technologies and algorithms that we have used.

Keywords: Security, data wiping, data recovery, automation, scheduling, patterns, overwrite data

Title of the Paper: Modelling using UML Diagrams of an Intelligent System for the Automatic Demonstration of Geometry Theorems


Authors: Anca Iordan, Manuela Panoiu, Ioan Baciu, Corina Daniela Cuntan

Abstract: This work presents the design of an intelligent system intended to develop the ability to prove geometry theorems. The system provides the user with a proof assistant that allows interactive visualization of several proofs of the same theorem, generated using three specific methods for the automatic proving of theorems: the area method, the full-angle method, and inference accomplishment. The component used to represent knowledge and proof mechanisms is implemented in the Prolog language, and the geometric construction associated with a theorem is rendered in the Java language.

Keywords: Intelligent Software, Geometry, Java, Prolog, Automatic Demonstration of Theorems

Title of the Paper: Design Using UML Diagrams of an Educational Informatics System for the Study of Computational Geometry Elements


Authors: Anca Iordan, Manuela Panoiu

Abstract: This paper presents the stages required to implement an informatics system for the study of computational geometry elements, such as determining the parallel and the perpendicular to a given line through a point, verifying that a point lies within a triangle, verifying whether a polygon is convex or concave, and determining the convex hull of a set of points. The modeling of the system is achieved through specific UML diagrams representing the stages of analysis, design and implementation, the system thus being described in a clear and concise manner.

Keywords: Educational Software, Computational Geometry, Java, Distance Education

Title of the Paper: A Study on Multiple Objects Detection, Loading and Control in Video for Augmented Reality


Authors: Sungmo Jung, Jae-Gu Song, Seoksoo Kim

Abstract: Since research on augmented reality has recently received attention, the M2M market has started to become active, and there are numerous efforts to apply the technology to real life in all sectors of society. However, with the existing marker-based augmented reality technology, only a designated object can be loaded from one marker, and another marker has to be added to load the same object again. To solve this problem, the relevant marker should be extracted and printed on screen so that multiple objects can be loaded. However, since the distance between markers is not measured while markers are detected and copied, the markers can overlap, in which case the objects are not augmented. To solve this problem, a circle with the longest radius is created from the focal point of the marker to be copied, so that no object is copied within the confines of the circle. Therefore, in this paper, multiple object detection and loading using PPHT has been developed, and the control of marker overlap during multiple object control has been studied using the Bresenham and Mean Shift algorithms.

Keywords: Multiple Objects Detection, Multiple Objects Loading, Multiple Objects Control, Computer Vision, PPHT, Bresenham Algorithm, Mean Shift Algorithm, Augmented Reality
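
Of the algorithms named above, Bresenham's line algorithm is the classic integer-only way to enumerate the pixels between two points, for example between two marker focal points when checking their spacing. The sketch below is the standard textbook form, not the authors' implementation, and the marker-spacing use is this editor's illustrative reading.

```python
def bresenham(x0, y0, x1, y1):
    """All integer grid points on the line segment from (x0, y0) to (x1, y1),
    using only integer additions and comparisons."""
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # running error term
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:   # step in x
            err += dy
            x0 += sx
        if e2 <= dx:   # step in y
            err += dx
            y0 += sy
    return points
```

Walking the rasterized segment between two marker centers gives an inexpensive pixel-level distance check before an object is copied.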

Title of the Paper: Research on A-Key Distribution Algorithms for Protecting Data of RS-485-based Industrial Infrastructure


Authors: Jae-Gu Song, Sungmo Jung, Seoksoo Kim

Abstract: The RS-485 protocol is a Modbus serial communication protocol, mainly used by SCADA (Supervisory Control And Data Acquisition) systems. Most industrial infrastructure using this communication protocol was designed without exposure to open network environments in mind and is therefore seriously vulnerable to security threats. This study examines key management architectures in order to suggest a key exchange algorithm suitable for the RS-485 protocol. The suggested algorithm supports both 1:1 and 1:N communications and minimizes the load arising from encoding/decoding so as to increase its applicability.

Keywords: RS-485, Modbus, SCADA, Security, Infrastructure, Key Distribution Algorithms, Key Management

Title of the Paper: An Approach to QoS based Selection and Composition of Semantic Web Services based upon Multi-Agent Negotiation


Authors: Sandeep Kumar, Nikos E. Mastorakis

Abstract: This paper presents a QoS (Quality of Service) based selection and composition model for semantic web services, in which negotiation is performed with all the discovered service providers and the negotiation-agreements generated from the negotiation are used in the selection process. A multi-agent negotiation based composition approach for semantic web services is presented, together with a negotiation-agreement based selection model that rates the service provider agents by assessing the various quality parameters included in the negotiation agreement. The implementation issues of the work are discussed and a comparative analysis is performed.

Keywords: Semantic web, negotiation, multi-agent system, selection, composition

Title of the Paper: Application of Unified Smart Classification and Modified Weight Elimination Algorithms to Damage Evaluation in Composite Structures


Authors: Mahmoud Z. Iskandarani

Abstract: A Unified Smart Classification Algorithm (USCA) for processing and classifying data obtained from different testing techniques is designed and tested. The developed algorithm conditions data taken from damaged composite structures such as modern car bodies and plane frame structures. It is used in conjunction with a Modified Weight Elimination neural network Algorithm (MWEA) to provide predictive models for impact damage in composite structures. The developed neural models correlate the various NDT testing techniques, such that in the absence of one technique, its results are predicted by the neural network through interrogation of the available data obtained using the other testing methods. The real and predicted data showed good agreement in terms of classification and prediction.

Keywords: Neural Networks, Classification, Damage, Composites, Algorithm, Prediction, Weight Elimination, Pruning

Title of the Paper: Methodology for SIP Infrastructure Performance Testing


Authors: Miroslav Voznak, Jan Rozhon

Abstract: This paper deals with a testing method suitable for SIP infrastructure. Performance testing of SIP infrastructure is still an open research issue, and no standardized methodology has been adopted yet. We present the main ideas of a methodology for testing the keystone of SIP-based infrastructure, the SIP server, in both SIP Proxy and B2BUA (Back-to-Back User Agent) configurations. Our methodology has its foundations in the work of the IT company Transnexus, and these foundations have been enhanced with ideas reflecting the nature of the SIP protocol. In addition, an entirely new methodology for benchmarking the SIP server in the B2BUA configuration has been introduced. This method exploits one of the attributes of the B2BUA, the flow of media passing through it, and measures the efficiency of codec translation relative to the performance measured without codec translation. Our approach offers a comprehensive method for testing SIP infrastructure, which has been verified experimentally. The results are presented in this paper together with appropriate comments and conclusions.

Keywords: Asterisk, B2BUA, codec translation, Opensips, Performance testing, SIP Proxy

Title of the Paper: Development of an Effective Assessment and Training Support System for Cognitive Ability for Special Children


Authors: Tan Meng Kuan, Eko Supriyanto, Yeo Kee Jiar, Yap Ee Han

Abstract: By definition, special children include children who have Down syndrome, autism, global developmental delay, epilepsy, slow learning abilities, and other conditions. This study focuses on children with Down syndrome, which occurs due to an extra copy of chromosome 21. An Early Intervention Program (EIP) is a systematic program of therapy, exercises, and activities designed to help children, especially special children. Cognitive development, the construction of thought processes, is one of the most important skills that has to be developed in children with Down syndrome in order for them to lead a normal life. This support system is focused mainly on helping them improve their logical thinking and memory skills. The cognitive assessment and training support system utilizes radio frequency identification (RFID) technology and is implemented in the C Sharp programming language. The completed system was tested, and feedback was obtained from parents or trainers of children with Down syndrome. The results show that the system can stably generate results in graphical form, and that the training for improving the cognitive ability of the children is reliable, being based on a globally recognized curriculum. In conclusion, the system can help trainers or parents improve the cognitive ability of children with Down syndrome.

Keywords: Support system, cognitive ability, special children, RFID, Early Intervention Program, Assessment and Training

Title of the Paper: Fuzzy ART for the Document Clustering By Using Evolutionary Computation


Authors: Shutan Hsieh, Ching-Long Su, Jeffrey Liaw

Abstract: Many clustering techniques have been developed to retrieve, filter, and categorize documents available in databases or on the Web. Organizing and storing information appropriately through document clustering is crucial for knowledge discovery and management. In this research, a hybrid intelligent approach is proposed to automate the clustering process based on the characteristics of each document, represented by fuzzy concept networks. Through the proposed approach, useful knowledge can be clustered and then utilized effectively and efficiently. In the literature, artificial neural networks (ANNs) have been widely applied to document clustering. However, the number of documents is huge, so it is hard to find the ANN parameters that produce the most appropriate clustering results. Traditionally, these parameters are adjusted manually by trial and error, which is time consuming and does not guarantee an optimum result. Therefore, a hybrid approach incorporating evolutionary computation (EC) and a Fuzzy Adaptive Resonance Theory (Fuzzy-ART) neural network is proposed to adjust the Fuzzy-ART parameters automatically so that the best document clustering results can be obtained. The proposed approach is tested using ninety articles from three different fields. The experimental results show that the proposed hybrid approach can generate the most appropriate Fuzzy-ART parameters for obtaining the desired clusters.

Keywords: Documents Clustering, Evolutionary Computation, Fuzzy ART, Knowledge Discovery

Title of the Paper: A Power-Efficient Secure Routing Protocol for Wireless Sensor Networks


Authors: Iman Almomani, Emad Almashakbeh

Abstract: The Wireless Sensor Network (WSN) is a hot research area due to its use in many military and civilian applications. A WSN consists of distributed, small, low-power sensors with limited capabilities that are scattered in the network field to sense different parameters in the environment. The sensed data are sent to a more powerful node called the sink node (Base Station). The sink node is usually connected to a power supply and is used to process the data and to connect the sensor network to other networks, such as the Internet. One of the major challenges in such networks is how to provide connectivity between the sensors and the sink node and how to exchange data while maintaining the security requirements and taking into consideration the sensors' limited energy, memory, and bandwidth. In this paper, a power-efficient, secure routing protocol is proposed to help manage the resources in WSNs. The proposed protocol is a hybrid of two major categories of protocols in WSNs, namely tree-based and cluster-based protocols. It is combined with a Fuzzy Logic inference system to aid in the selection of the best route based on a combination of three factors: the path length, the available power, and the node reputation resulting from the Intrusion Detection System (IDS). The proposed protocol uses three Fuzzy Inference Systems (FIS) implemented in two tiers. Tier one chooses the best route in terms of shortest length and highest available power; tier two provides a security assessment of the selected route.

Keywords: Wireless Sensor Network (WSN), Security, Routing, Power Saving, Clustering, Intrusion Detection System (IDS), Fuzzy Logic

Title of the Paper: Sound-Colour Synaesthesia, Chromatic Representation of Sound Waves in Java Applets


Authors: Stela Dragulin, Livia Sangeorzan, Mircea Parpalea

Abstract: This paper presents a way in which sound waves can be represented chromatically in the Java language. As its title reveals, it is intended as instructional material to help develop interactive tools for efficient science education, especially in the field of acoustics. The developed Java applets illustrate the main aspects of sound phenomena. A direct correspondence between the visible frequencies of light and the frequencies of sound waves is realized by transforming the sound wave frequency range onto a linear scale. The construction of the natural light spectrum is explained in detail, indicating the source code used in the applets. Java software was developed for converting sound wave intensities into colour saturation coefficients. The application allows further development in order to generate a colour visual interpretation of the musical atmosphere and to develop the performer's creativity, building bridges between colours and the artistic sensitivity of human hearing.

Keywords: RGB light, Java Applets, sound waves, music, education

Title of the Paper: Comparison of a Crossover Operator in Binary-Coded Genetic Algorithms


Authors: Stjepan Picek, Marin Golub

Abstract: Genetic algorithms (GAs) are a method that mimics the process of natural evolution in an effort to find good solutions. In that process, the crossover operator plays an important role, and to understand genetic algorithms as a whole it is necessary to understand the role of the crossover operator. Today, there are a number of different crossover operators that can be used in binary-coded GAs; how does one decide which operator to use when solving a problem? When dealing with different classes of problems, crossover operators show various levels of efficiency in solving those problems. A number of test functions with various levels of difficulty have been selected as a testbed for determining the performance of the crossover operators. The aim of this paper is to present a larger set of crossover operators used in genetic algorithms with binary representation and to draw some conclusions about their efficiency. The results presented here confirm the high efficiency of uniform crossover and two-point crossover, but also show some interesting comparisons among other, less used crossover operators.

Keywords: Evolutionary computation, Genetic algorithms, Crossover operator, Efficiency, Binary representation, Test functions
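The three classic operator families compared in the paper can be sketched in a few lines of Python. The function names and the encoding of parents as bit lists are illustrative choices for this sketch, not the authors' implementation:

```python
import random

def one_point(p1, p2, rng):
    """Swap the tails of two parents after a single random cut point."""
    cut = rng.randrange(1, len(p1))
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def two_point(p1, p2, rng):
    """Exchange the gene segment between two random cut points."""
    a, b = sorted(rng.sample(range(1, len(p1)), 2))
    return (p1[:a] + p2[a:b] + p1[b:],
            p2[:a] + p1[a:b] + p2[b:])

def uniform(p1, p2, rng):
    """Decide per gene, with probability 0.5, which parent contributes."""
    mask = [rng.random() < 0.5 for _ in p1]
    c1 = [x if m else y for m, x, y in zip(mask, p1, p2)]
    c2 = [y if m else x for m, x, y in zip(mask, p1, p2)]
    return c1, c2
```

Each operator returns two children of the same length as the parents, with every gene taken from one parent or the other; uniform crossover differs from the point-based operators only in how the contributing parent is chosen per position.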

Issue 10, Volume 9, October 2010

Title of the Paper: Probabilistic Model for Accuracy Estimation in Approximate Monodimensional Analyses


Authors: Carlo Dell'Aquila, Francesco Di Tria, Ezio Lefons, Filippo Tangorra

Abstract: Approximate query processing is often based on analytical methodologies able to provide fast responses to queries. As a counterpart, the approximate answers are affected by a small amount of error. Nowadays, these techniques are being exploited in data warehousing environments, because the queries devoted to extracting information involve high-cardinality relations and therefore require high computational time. Approximate answers are profitably used in the decision-making process, where total precision is not needed. Thus, it is important to provide decision makers with accuracy estimates of the approximate answers, that is, a measure of how reliable the approximate answer is. Here, a probabilistic model is presented for providing such an accuracy measure when the analytical methodology used for decisional analyses is based on polynomial approximation. This probabilistic model is a Bayesian network able to estimate the relative error of the approximate answers.

Keywords: Analytic query processing, Approximate query answer, Polynomial approximation, Accuracy estimation, Probabilistic model

Title of the Paper: The Relative Potential Field as a Novel Physics-Inspired Method for Image Analysis


Authors: X. D. Zhuang, N. E. Mastorakis

Abstract: In this paper, the relative potential field is proposed as a novel image transform inspired by the physical electrostatic field. A general form of the image potential field is presented, from which the relative potential is defined by introducing the factor of gray-scale difference into the potential field. The properties of the relative potential are investigated experimentally and analyzed, and on this basis an image segmentation method is proposed based on region division and merging in the relative potential field. Experimental results demonstrate the effectiveness of the proposed image segmentation method, and also indicate the promising application of the relative potential in image processing tasks.

Keywords: Relative potential field, electro-static, image transform, image segmentation

Title of the Paper: A Study on the Feasibility of the Inverse Maximum Flow Problems and Flow Modification Techniques in the Case of Non-Feasibility


Authors: Adrian Deaconu, Eleonor Ciurea, Corneliu Marinescu

Abstract: The feasibility of the inverse maximum flow problems (denoted IMFG) is studied. Feasibility can be tested in linear time. When IMFG is not feasible, a new inverse combinatorial optimization problem is introduced and solved: modify the flow as little as possible so that the problem becomes feasible for the modified flow. An example is presented.

Keywords: Inverse optimization, maximum flow

Title of the Paper: Application of Genetic Algorithm for Designing Cellular Manufacturing System Incrementally


Authors: J. Rezaeian, N. Javadian, R. Tavakkoli-Moghaddam

Abstract: One important issue regarding the implementation of cellular manufacturing systems is deciding whether to convert an existing job shop into a cellular manufacturing system comprehensively, in a single step, or incrementally, by forming cells one after the other and taking advantage of the experience gained during implementation. In this paper, two heuristic methods based on multi-stage programming and a genetic algorithm are proposed for incremental cell formation. The results show that multi-stage programming solves small problems faster than exact algorithms such as branch and bound. A heuristic procedure based on a genetic algorithm is built on the multi-stage programming to handle larger problem sizes.

Keywords: Incremental cell formation, Cellular manufacturing system, Multi-Stage programming, Genetic algorithm, Job shop, Comprehensive cell formation

Title of the Paper: Romanian Black Sea Resorts. Study on the Summer Offers


Authors: Mirela-Catrinel Voicu

Abstract: In Romania, the Black Sea seaside is one of the most popular destinations for summer holidays. In this paper we present a database model including information about resorts, hotels, hotel star classifications, hotel room types, room amenities, hotel facilities, the holiday time period, the number of days of a holiday stay, meal types, accommodation rates, etc. We built this database model based on data presented in travel agencies' catalogues. To explore the data, we apply aggregation algorithms. Our study focuses on detecting the best rates for the summer holiday, depending on tourist preferences and other features.

Keywords: Tourism, seaside summer offer, database, algorithms

Title of the Paper: Electronic Tools for Support of Strategic Human Resource Management


Authors: Elissaveta Gourova, Kostadinka Toteva

Abstract: The present paper considers the importance of human resources management for the success of organisations in the knowledge society. It provides a short theoretical background on the commitment and motivation of employees and on the importance of human resource management for meeting organisational goals and achieving the organisation's strategy. The deployment of information and communication technologies in this process facilitates the work of managers and provides them with better tools for enhancing employees' participation and involvement in the organisation's processes. The paper presents new electronic tools developed as an extension to existing human resources management software, aimed at collecting objective and subjective feedback from employees. An example of the application of these tools is provided, together with an analysis of the resulting improvement of human resources management in the organisation.

Keywords: Software applications, employees’ commitment, motivation, human resources management

Title of the Paper: HDS: a Software Framework for the Realization of Pervasive Applications


Authors: Agostino Poggi

Abstract: Nowadays pervasive computing is one of the most active research fields because it promises environments where computing and communication devices are gracefully integrated with users, so that applications can provide largely invisible support for the tasks users perform. This paper presents a software framework, called HDS (Heterogeneous Distributed System), that tries to simplify the realization of pervasive applications by merging the client-server and peer-to-peer paradigms, and by implementing all the interactions among the processes of a system through the exchange of typed messages and the use of composition filters for driving and dynamically adapting the behavior of the system. Typed messages and composition filters are the elements that mainly characterize this software framework. Typed messages can be considered an object-oriented "implementation" of the message types defined by an agent communication language; they make HDS a suitable software framework both for the realization of multi-agent systems and for the reuse of multi-agent models and techniques in non-agent-based systems. Composition filters drive and adapt the behavior of a system by acting on the exchange of messages. On the one hand, composition filters can constrain the exchange of messages (e.g., block the sending or reception of some messages to or from some processes), modify the flow of messages (e.g., redirect some messages to another destination), and manipulate messages (e.g., encrypt and decrypt them). On the other hand, processes can dynamically add and remove composition filters to adapt the behavior of a system to any new hardware and software configuration and to any new user requirement.

Keywords: Typed messages, Composition filters, Software framework, Pervasive systems, Multi-agent systems, Java
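The composition-filter idea described above (blocking, redirecting, and transforming messages, with filters added and removed at run time) can be illustrated with a minimal sketch. This is not the HDS API, which is Java-based; all names here are invented for the example:

```python
class FilterChain:
    """Composition filters applied, in order, to every outgoing message.

    A filter receives a message dict and returns it (possibly modified),
    or None to block delivery entirely.
    """
    def __init__(self):
        self.filters = []

    def add(self, f):
        self.filters.append(f)

    def remove(self, f):
        self.filters.remove(f)

    def send(self, msg):
        for f in self.filters:
            msg = f(msg)
            if msg is None:        # a filter blocked the message
                return None
        return msg

# Example filters: block one destination, redirect another.
def block_logger(msg):
    return None if msg["to"] == "logger" else msg

def redirect_backup(msg):
    if msg["to"] == "primary":
        msg = dict(msg, to="backup")
    return msg
```

Because filters can be added and removed while the system runs, the chain mirrors the dynamic adaptation to new configurations and user requirements described in the abstract.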

Title of the Paper: An Algorithm for Determination of the Guillotine Restrictions for a Rectangular Cutting-Stock Pattern


Authors: Daniela Marinescu, Alexandra Baicoianu

Abstract: Starting from a two-dimensional rectangular cutting-stock pattern with gaps, this paper focuses on the problem of determining whether the pattern satisfies guillotine restrictions, and proposes an algorithm for solving it. First we present two new graph representations of the cutting pattern: the weighted graph of downward adjacency and the weighted graph of rightward adjacency. Using these representations we propose a method to verify the guillotine restrictions of a pattern, which can be applied to cutting-stock patterns with gaps, but also to cutting or covering patterns without gaps and overlapping.

Keywords: Two-dimensional cutting-stock problems, cutting pattern representation, guillotine restrictions
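The verification task can also be stated directly on rectangle coordinates: a pattern is guillotine-cuttable if some full horizontal or vertical edge-to-edge cut crosses no piece and both resulting sub-patterns are themselves guillotine-cuttable. The backtracking sketch below checks this definition on coordinate tuples; it is an illustration of the property being verified, not the paper's graph-based algorithm:

```python
def is_guillotine(rects):
    """rects: list of (x1, y1, x2, y2) axis-aligned, non-overlapping pieces."""
    if len(rects) <= 1:
        return True
    for lo in (0, 1):                       # 0: vertical cuts, 1: horizontal
        hi = lo + 2
        for c in sorted({r[hi] for r in rects}):
            left = [r for r in rects if r[hi] <= c]
            right = [r for r in rects if r[lo] >= c]
            # a full cut must split the pieces without crossing any of them
            if left and right and len(left) + len(right) == len(rects):
                if is_guillotine(left) and is_guillotine(right):
                    return True
    return False
```

The classic "pinwheel" arrangement of five rectangles is the standard non-guillotine example: every candidate full cut crosses the interior of some piece, so the recursion finds no valid split.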

Title of the Paper: Modelling of Web Domain Visits by IF-Inference System


Authors: Vladimir Olej, Petr Hajek, Jana Filipova

Abstract: This paper presents basic notions of web mining and of fuzzy inference systems based on the Takagi-Sugeno fuzzy model. On the basis of this fuzzy inference system and the IF-sets introduced by K. T. Atanassov, novel IF-inference systems can be designed. Accordingly, an IF-inference system is developed for time series prediction. In the next part of the paper we describe the prediction of web domain visits by IF-inference systems and analyze the results.

Keywords: Web mining, IF-inference system, fuzzy inference system, web domain visits, prediction, time series

Title of the Paper: Grid Infrastructure Development as Support for e-Science Services


Authors: Gabriel Neagu, Alexandru Stanciu

Abstract: In recent years the e-Infrastructure concept has been extensively elaborated and promoted as an essential pillar for the implementation of the ERA (European Research Area). The main components of this infrastructure are the network for education and research and the Grid infrastructures. At the European level, important efforts and resources have been devoted to the development of both components, with joint financial support provided by Framework Programmes 6 and 7 of the EU and by national research programmes. At the national level, the high-speed communication network for education and research and the national Grid infrastructure for research are included in the list of priority investment projects in the Information and Communications Technologies domain. The main financial support for the development of these projects is provided by the Sectoral Operational Programme "Increase of Economic Competitiveness". ICI Bucharest is among the beneficiary organizations of this support. This paper presents the European context for e-Infrastructure development as support for the implementation of the e-Science concept, the evolution and current status of Grid infrastructures at the European and national levels, a short review of the development of Grid computing at ICI Bucharest, and the main objectives and activities of the structural funds grant awarded to the institute for a significant upgrade of the current Grid site RO-01-ICI.

Keywords: E-Infrastructure, e-Science, Grid infrastructure, EGEE, SEE-GRID, EGI, RoGrid, GridMOSI, Virtual organization, RO-01-ICI grid site

Title of the Paper: RC5-Based Security in Wireless Sensor Networks: Utilization and Performance


Authors: Juha Kukkurainen, Mikael Soini, Lauri Sydanheimo

Abstract: Information transferred in a wireless sensor network can be sensitive, hence it is vital to secure the data on the network. Enhancing the network's security affects network operation by increasing the data handling time. In this paper, the trade-offs between enhanced security and sensor network performance are discussed. The studies concentrate on the increase in computation time and energy consumption as enhanced security features and levels are utilized. This paper presents RC5-based encryption and CMAC authentication, used to achieve data confidentiality, freshness, replay protection, authentication, and integrity. These features enhance data security but can also decrease sensor network operability because of the added computation and communication load. By selecting a suitable algorithm and operating conditions for encryption and authentication, data security in wireless sensor networks can be improved with minor resource losses.

Keywords: Consumption, KILAVI sensor network platform, performance trade-offs, wireless sensor network security

Title of the Paper: Generating Logic Representations for Programs in a Language Independent Fashion


Authors: Ciprian-Bogdan Chirila, Calin Jebelean, Titus Slavici, Vladimir Cretu

Abstract: In today's software engineering, program analysis and program transformation are operations that rely strongly on software models. An important share in this direction is held by logic-based models, described in a declarative language such as Prolog. There are some approaches used to represent information about software systems while preserving the logic relations between entities, but they are normally limited to software systems written in a certain programming language. There are also language-independent approaches to logic-based representation of programs, but they are usually based on syntactic information about the modeled program and provide little information about the logic relations between entities at the semantic level. This paper describes a methodology that unites the two kinds of approaches, being both language independent and expressive at the semantic level, at the cost of a more complex generation process.

Keywords: Program transformation, semantic actions, metamodel-conforming logic representation

Title of the Paper: SVM-based Supervised and Unsupervised Classification Schemes


Authors: Luminita State, Iuliana Paraschiv-Munteanu

Abstract: The aim of the reported research is to propose a training algorithm for support vector machines based on kernel functions and to test its performance in the case of non-linearly separable data. The training is based on the Sequential Minimal Optimization introduced by J. C. Platt in 1999. Several classification schemes resulting from combining the SVM and 2-means methods are proposed in the fifth section of the paper. A series of experimentally derived conclusions concerning the comparative analysis of the performance of the proposed methods is summarized in the final part of the paper. The tests were performed on samples randomly generated from two-dimensional Gaussian distributions, and on data available in the Wisconsin Diagnostic Breast Cancer Database.

Keywords: Support Vector Machine, Pattern recognition, Statistical learning theory, Kernel functions, Principal Components Analysis, k-means Algorithm
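The 2-means component of the combined schemes is plain Lloyd iteration with k = 2, which can be sketched briefly. The naive initialisation and the data layout (tuples of floats) are assumptions of this sketch, and the SVM/SMO part of the paper's method is omitted:

```python
def two_means(points, iters=100):
    """Lloyd's algorithm with k = 2 on a list of coordinate tuples."""
    cents = [points[0], points[-1]]          # naive initialisation
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in cents]
            groups[d[1] < d[0]].append(p)    # assign to nearest centroid
        new = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else c
               for g, c in zip(groups, cents)]
        if new == cents:                     # converged
            break
        cents = new
    labels = [int(sum((a - b) ** 2 for a, b in zip(p, cents[1])) <
                  sum((a - b) ** 2 for a, b in zip(p, cents[0])))
              for p in points]
    return cents, labels
```

In the paper's supervised/unsupervised combinations, such cluster labels can serve as a partition against which the SVM decision boundary is trained or compared.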

Title of the Paper: Identifying the Course Network Structures Based upon Records of In-service Learning Database


Authors: Lung-Hsing Kuo, Hsieh-Hua Yang, Jui-Chen Yu, Hung-Jen Yang, Lydia Lin

Abstract: This study focuses on the social network structure of in-service development for high school teachers of Chinese literature in Taiwan. The social network analysis methods used include core measures, the identification of groups for role analysis, statistics-based graph analysis, and so on. In this study we use the 2009 National Teacher In-service Education Information Network database to obtain 5,417 sample records. The expected results should make it possible to understand the centrality of the overall in-service training network of Chinese literature teachers, the course subject categories in which teachers participate and how they are connected, the number of subgroups generated by education seminars, the impact of subgroups on the overall network, etc.

Keywords: In-service development, Social network structure, High school, UCINET, Chinese literature teacher

Title of the Paper: A Framework for Schema Matcher Composition


Authors: Balazs Villanyi, Peter Martinek, Bela Szikora

Abstract: Enterprise schemas tend to differ, which is a key issue when seamless communication between systems is of utmost importance. One solution could be the development of standards, which could then be enforced; however, vendors seem reluctant to comply with them, and communication between existing and legacy systems remains unsolved. Another solution is schema matching, which resolves the matter at the data level, and the process does not require vendors to adhere to any kind of predefined schemas. On the other hand, the task is very complex, even for human evaluators. Some of the solutions proposed so far are fairly promising; however, their accuracy varies. Our goal was to find means by which the results could be enhanced. We have been focusing on the development of solutions which do not change the concept of the algorithms, but fine-tune them so that they achieve higher accuracy. Our experiments showed that the results of the matchers may vary widely depending on the actual parameter settings. It has also turned out that the parameters should be set for each scenario individually, as only this way are the best results warranted. In this article, we present a general approach for optimally disassembling existing solutions and combining some of the resulting components so that the new matcher supersedes the donor ones. The composition and the optimal parameter setting combined provide a framework capable of enhanced performance. Improved accuracy lessens the need for follow-up human supervision.

Keywords: Schema matching, optimization, algorithm analysis, performance improvement, framework definition

Issue 11, Volume 9, November 2010

Title of the Paper: Assessment of Evapotranspiration Using Remote Sensing Data and Grid Computing and Application


Authors: Cristina Serban, Carmen Maftei, Alina Barbulescu

Abstract: The estimation of evapotranspiration (ET) plays an essential role in all activities related to water resources management. In this study we developed a procedure to calculate the spatial distribution of evapotranspiration using remote sensing data. What distinguishes it from other procedures is the use of Grid computing to calculate evapotranspiration at the regional scale and to manage the distributed data sets. Another merit is that it requires a small number of inputs; all other data, such as net radiation, soil heat flux, albedo, etc., were obtained from remote sensing data. The procedure also calculates the NDVI (Normalized Difference Vegetation Index) and LST (Land Surface Temperature) and produces the corresponding thematic maps. A Web-based client interface has also been built in order to provide the application with Internet-based accessibility. The procedure has been applied to the Dobrogea region, located in the south-east of Romania, using a Landsat ETM+ image acquired on 7 June 2000. The evapotranspiration was retrieved at the time of the satellite overpass and integrated over 24 hours on a pixel-by-pixel basis for daily estimation. The estimated evapotranspiration has been compared with local measurements. The results obtained are promising and appear to provide acceptable estimates over the study area.

Keywords: Remote sensing, Grid computing, evapotranspiration, land surface temperature
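Of the derived products mentioned, NDVI is the simplest to reproduce: it is the normalised difference of the near-infrared and red reflectance bands. The per-pixel sketch below uses plain nested lists as a stand-in for raster bands; a real workflow would operate on the satellite imagery arrays directly:

```python
def ndvi(nir, red, eps=1e-12):
    """Normalized Difference Vegetation Index for one pixel:
    (NIR - RED) / (NIR + RED); eps guards against division by zero."""
    return (nir - red) / (nir + red + eps)

def ndvi_map(nir_band, red_band):
    """Apply NDVI pixel-by-pixel over two equally shaped 2-D bands."""
    return [[ndvi(n, r) for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]
```

Vegetated pixels reflect strongly in the near-infrared and weakly in the red, so they yield NDVI values close to 1, while bare soil and water yield values near or below 0.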

Title of the Paper: Municipal Revenue Prediction by Ensembles of Neural Networks and Support Vector Machines


Authors: Petr Hajek, Vladimir Olej

Abstract: Municipalities have to pay increasing attention to revenue prediction due to fiscal stress. Currently, judgmental, extrapolative, and deterministic models are used for municipal revenue prediction. In this paper we present designs of neural network and support vector machine ensembles for a real-world regression problem, i.e. the prediction of municipal revenue. Base learners, as well as linear regression models, are used as benchmark methods. We show that there is no single ensemble method suited to this regression problem. However, the ensembles of support vector machines and neural networks significantly outperformed the base learners and linear regression models.

Keywords: Municipal revenue, prediction, regression, support vector machine ensembles, modelling, neural network ensembles
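A minimal ensemble regressor in the spirit described above (base learners trained on bootstrap resamples, with their predictions averaged) can be sketched as follows. Simple least-squares lines stand in for the neural-network and SVM base learners; the function names and parameters are assumptions of the sketch, not the paper's models:

```python
import random

def fit_line(xs, ys):
    """Closed-form least-squares fit y = a + b*x (one base learner)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0          # guard degenerate resamples
    a = my - b * mx
    return lambda x: a + b * x

def bagged_ensemble(xs, ys, n_models=25, seed=1):
    """Train base learners on bootstrap resamples; average their outputs."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(xs)) for _ in xs]
        models.append(fit_line([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(m(x) for m in models) / len(models)
```

Averaging reduces the variance contributed by any single resample, which is the usual motivation for ensembling unstable learners on regression tasks.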

Title of the Paper: Performance Evaluation of Intel's Quad Core Processors for Embedded Applications


Authors: Jareer H. Abdel-Qader, Roger S. Walker

Abstract: Multiprocessing is currently implemented using either chip multiprocessing (CMP) or simultaneous multithreading (SMT). Multi-core processors, which represent CMP, are widely used in desktop and server applications and are now appearing in real-time embedded applications. We are investigating optimal configurations of some of the available multi-core processors suitable for developing real-time software for a multithreaded application used for pavement performance measurements. For the application discussed in this paper we are considering the use of either the Intel Core 2 Quad or the Core i7 (a quad-core processor with Hyper-Threading (HT) technology). Processor performance is a major requirement in this set of real-time, computationally intensive embedded applications. The performance of both processors is measured and evaluated using single- and multi-threaded workloads supplied by different benchmark suites. For the Core i7 processor we also provide an evaluation of the HT technology implemented in each of its cores.

Keywords: Quad core processors, Multi-threading, Performance Evaluation, Benchmark, Memory bandwidth, Memory Latency

Title of the Paper: Design and Implementation of Real-Time EP80579 Based Embedded System


Authors: Jareer H. Abdel-Qader, Roger S. Walker

Abstract: With advances in the integration of different units, such as I/O controllers and network interfaces, into a single chip, Intel introduced the low-power EP80579 embedded processor. This processor is the first IA-based system-on-chip (SoC) with an IA-32 processor core, North and South Bridges, and an integrated accelerator and network interface. In this paper we show the main steps in designing a real-time embedded system, along with identifying the hardware and software requirements to implement such a system. The embedded system introduced in this work performs several tasks regarding road surface conditions based on multiple sensor readings. The sensor data are processed in real time to reconstruct the road profile and provide an estimate of the texture content of the road surface. The EP80579 SoC is used in the design of the system. Modeling is done with the aid of the UML profile for Modeling and Analysis of Real-Time and Embedded (MARTE) systems.

Keywords: System-on-a-chip, Modeling Real-time systems, parallel processing modeling, MARTE, Embedded Application, Road Profiler

Title of the Paper: Supplying Goods and Materials to the Offshore Islands Using ILP


Authors: Yu-Cheng Lin, Shin-Jia Chen, Ping-Liang Chen, Jui-Jung Liu, Fan-Chen Tseng, La-Ting Song

Abstract: Many problems are proven to be NP-complete or NP-hard. To solve such problems and obtain optimal solutions, it is usual to transform them into Integer Linear Programming formulations. The Coast Guard Administration (CGA), an agency with jurisdiction over both coastal and oceanic affairs, is in charge of harbor security checks, resolving fishery conflicts, and investigating smuggling and illegal immigration or emigration on various outlying islands, including Kinmen, Mazu, Penghu, Pengjiayu, Guishandao, Xiaoliuqiu, Ludao and Lanyu. Supplying livelihood materials, including drinking water, rice, and oil, to the officers stationed on the different outlying islands is a critical issue. In this paper, we discuss the demand for rice and drinking water in supplying the people on the set of large islands. This research explores and assesses a model for the supply of livelihood materials to these outlying islands. By calculating and fitting an optimal supply model, this research expects that such a model can decrease the budget or cost of supply and improve the quality of livelihood materials for the officers stationed on the outlying islands.

Keywords: Integer Linear Programming, Supplying, Coast Guard Administration, Offshore islands
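At the scale of a handful of islands, the kind of integer program described can even be solved by exhaustive search over the integer variables. The cost model below (supply trips with a fixed per-trip cost and capacity) is a toy stand-in invented for this sketch, not the paper's formulation:

```python
from itertools import product

def min_cost_supply(costs, demands, capacity, max_trips=10):
    """Minimise total trip cost subject to trips[i] * capacity >= demands[i].

    costs[i]:   cost of one supply trip to island i
    demands[i]: required amount of material on island i
    Returns (total_cost, trips) or None if infeasible within max_trips.
    """
    best = None
    for trips in product(range(max_trips + 1), repeat=len(demands)):
        if all(t * capacity >= d for t, d in zip(trips, demands)):
            cost = sum(t * c for t, c in zip(trips, costs))
            if best is None or cost < best[0]:
                best = (cost, trips)
    return best
```

Real instances with shared vessels, routes, and multiple commodities couple the variables and grow too large for enumeration, which is where an ILP solver with branch and bound becomes necessary.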

Title of the Paper: The Effect of Training Set Size for the Performance of Neural Networks of Classification


Authors: Hyontai Sug

Abstract: Even though multilayer perceptrons and radial basis function networks both belong to the class of artificial neural networks and are used for similar tasks, they have very different structures and training mechanisms. Some researchers have shown better performance with radial basis function networks, while others have shown different results with multilayer perceptrons. This paper compares the classification accuracy of the two kinds of neural network with respect to training data set size, and shows that their performance can depend differently on the size of the training data set. Experiments show a tendency for multilayer perceptrons to perform better on relatively larger training data sets, even though radial basis function networks perform better on relatively smaller training sets drawn from the same data. The experiments were done with four real-world data sets.

Keywords: Neural networks, multilayer perceptrons, radial basis function networks, training data set, prediction

Title of the Paper: A Semantic Schema - based Approach for Natural Language Translation


Authors: Mihaela Colhon, Nicolae Tandareanu

Abstract: Processing natural language statements to obtain equivalent translations in a different language has long been an area of research in Artificial Intelligence. In machine translation systems, an intermediate representation of the input is necessary to express the result of the phrase analysis. These representations treat each phrase as a character string and construct the corresponding syntactic and/or semantic representation structure. In the proposed approach, the representation is made by means of a semantic-network-type structure, named a semantic schema, with focus on the dependency relations existing between the sentence words. The resulting schema components are further evaluated (using an interpretation system) with the corresponding constructions from the language into which translation occurs.

Keywords: Semantic Schema, Natural Language, Machine Translation, Morpho-syntactic data, Dependency Relationships

Title of the Paper: Knowledge Processing in Contact Centers using a Multi-Agent Architecture


Authors: Claudiu Ionut Popirlan

Abstract: The explosion of multimedia data, the continuous growth in computing power, and advances in machine learning and in speech and natural language processing are making it possible to create a new breed of virtual intelligent agents capable of performing sophisticated and complex tasks that are radically transforming contact centers. These virtual agents enable ubiquitous and personalized access to communication services from anywhere. As contact centers grow and become more complex in their function and organization, knowledge processes become more formal to ensure consistency of advice and efficiency. This paper suggests a multi-agent approach to distributed knowledge processing and discusses the use of an enhanced mobile agent architecture (EMA) [12] in the context of contact centers, to advance and frame future discussion of these knowledge-intensive environments. We demonstrate the benefits of enterprise resource planning (ERP) management using multi-agent systems for contact centers with distributed knowledge, considering an adequate case study and providing experimental results.

Keywords: Multi-Agent Systems, Software Agent, Mobile Agent, Contact Centers, Distributed Knowledge, Enterprise Resource Planning (ERP)

Title of the Paper: A Computerized Technique for Multi-Criteria Analysis of Seismic Disasters - GESKEE Database


Authors: Cosmin Filip, Cristina Serban, Mirela Popa, Gabriela Draghici

Abstract: An econometric scaling approach to the vulnerability and risk of building stock under major seismic events is essential for understanding the impact and consequences of these earthquakes. It also has predictive value for estimating the losses of future events. Because such multi-criteria approaches use a large amount of information, there is a strong need to develop computerized techniques to automate the calculation process, to manage the data more easily, and to represent the results as suggestively as possible. Therefore, the paper presents the GESKEE (Global Econometric Scaling using Knowledge on Earthquake Effects) Database developed by the authors, an indispensable tool for the econometric scaling approach to seismic risk assessment. To this aim, the database was structured starting from the special requirements imposed by the GESKEE Disaster Scale (2010 version). As a result, according to the needs of the econometric calculation and graphical representation of the GESKEE Disaster Scale (2010 version), a flexible and helpful database (the GESKEE Database) was developed. Building up this database was the main key to improving, developing, updating, and better interpreting the GESKEE Disaster Scale (from the 1998 version to the 2010 version). The results are promising and appear to provide increased predictive value for the GESKEE Disaster Scale. We consider that this application will be greatly useful and convenient for a transparent and more efficient seismic risk assessment and loss scaling, providing a better understanding of the impact of past seismic events and thus real help in the calibration of earthquake disaster prevention policies. In order to point out the innovative edge of the GESKEE Database, an important tool in the social and economic quantification of the seismic vulnerability of building stocks and of earthquake losses, we present some examples of graphic representations and some interpretations of the results.

Keywords: GESKEE Database, GESKEE Disaster Scale, seismic risk assessment, econometric scale, Earthquake, seismic vulnerability, disaster mitigation

Title of the Paper: A New System Architecture for Flexible Database Conversion


Authors: Siti Z. Z. Abidin, Suzana Ahmad, Wael M. S. Yafooz

Abstract: Much research has been undertaken on database sharing, integration, conversion, merging and migration. In particular, database conversion has attracted researchers' attention due to the rapid change in computer technology. There are also several tools available on the Web, free of charge, for handling database conversion. All of these works focus on the relational database, which consists of an integrated collection of logically related records. Databases play an important role in computing whenever a large amount of data and information needs to be stored and retrieved. Today, databases are used in many disciplines, such as business, education, general administration, medicine and many more. Database research has advanced from file management systems to data warehousing, with discussion of how a database can be significantly sustainable and have the potential to be extended and modified to suit the current situation and technology. In this paper, we propose a new method for database conversion by providing a single master database that can accept multiple databases of any type through the use of Java Database Connectivity (JDBC) and an application programming interface (API). The key contribution of this method is the ability to accept a single record, multiple records, or all the records of a database to be converted into any other database type. Thus, any existing form of database can be integrated and updated without the need to design a new database system to cope with the new technology. In this way, old or existing databases can be used for an unlimited lifetime and in a broader scope of application domains.

Keywords: Database, Data sharing, Integration, Conversion, Access, Integrity, Migration
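The record-level conversion the abstract describes — reading one, several, or all records from a source database and inserting them into a target of a different type through a generic connectivity layer — can be sketched as follows. The paper's tool is built on Java/JDBC; this illustration uses Python's DB-API with sqlite3 standing in for both endpoints, and the function name and `rows` parameter are assumptions, not the authors' API.

```python
import sqlite3

def convert_records(src_conn, dst_conn, table, rows="all"):
    """Copy one record, several records, or all records of `table`
    from src_conn into dst_conn, creating the table if needed.
    Column names are discovered from the source cursor, so the
    same routine works for any table shape."""
    cur = src_conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    dst_conn.execute(
        f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(cols)})")
    placeholders = ", ".join("?" for _ in cols)
    # "all" copies the whole table; an integer copies that many records.
    data = cur.fetchall() if rows == "all" else cur.fetchmany(rows)
    dst_conn.executemany(
        f"INSERT INTO {table} VALUES ({placeholders})", data)
    dst_conn.commit()
    return len(data)
```

Because the column list and placeholders are derived at run time, the master database needs no compile-time knowledge of the source schema — the same property the paper attributes to its JDBC-based design.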

Title of the Paper: Threats to Voice over IP Communications Systems


Authors: Miroslav Voznak, Filip Rezac

Abstract: This article deals with various techniques of VoIP attacks and with VoIP communication security. The threats are divided into several categories according to their specific behaviour and their impact on the affected system, and we describe effective methods to prevent or mitigate them. Our work focuses especially on Spam over Internet Telephony (SPIT) as a real threat for the future; we consider spam in IP telephony to be very serious given the situation we already face in e-mail communication. We have developed both an AntiSPIT tool defending communication systems against SPIT attacks and a tool generating SPIT attacks. AntiSPIT provides effective protection based on a statistical blacklist and works without the participation of the called party, which is a significant advantage. AntiSPIT was implemented on the Asterisk open-source platform and became one way of protecting against the voice spam expected in the near future.

Keywords: IP telephony, DoS, Security, Attack, AntiSPIT, Asterisk
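A statistical blacklist of the kind the abstract attributes to AntiSPIT can be sketched as a score kept per caller from observed call behaviour — here, repeated very short calls (callees hanging up on voice spam) raise the score until the caller is blocked, with no action required from the called party. The class name, thresholds and scoring rule are illustrative assumptions, not AntiSPIT's actual parameters.

```python
from collections import defaultdict

class StatisticalBlacklist:
    """Per-caller penalty score built from call-duration statistics.
    Callers whose calls are repeatedly terminated quickly accumulate
    points; at or above `block_score` they are treated as SPIT sources."""

    def __init__(self, short_call_s=5.0, block_score=3):
        self.short_call_s = short_call_s
        self.block_score = block_score
        self.scores = defaultdict(int)

    def record_call(self, caller, duration_s):
        # A very short call suggests the callee rejected voice spam;
        # a normal-length call slowly rehabilitates the caller.
        if duration_s < self.short_call_s:
            self.scores[caller] += 1
        else:
            self.scores[caller] = max(0, self.scores[caller] - 1)

    def is_blocked(self, caller):
        return self.scores[caller] >= self.block_score
```

The key property, as in the abstract, is that classification is passive: it is driven entirely by observed statistics rather than by callee feedback.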

Title of the Paper: Analysis of Intra-Person Variability of Features for Off-line Signature Verification


Authors: Bence Kovari, Hassan Charaf

Abstract: One of the major challenges in off-line signature verification is that a person’s own signature is influenced by a number of external and internal factors, resulting in high variability even between signatures written by the same signer. This paper proposes a method that models the intra-person variability of a signature feature and identifies and eliminates the effects of external factors. To demonstrate the efficiency of the algorithm, a sample signature verifier is constructed and evaluated on the Signature Verification Competition 2004 database. Experiments have shown that with 3 features (endings, loops and skew vectors) the system achieves an average error rate of 12%. These results may be further improved by increasing the number of features used during the comparison of signatures.

Keywords: Signature verification, Off-line, Classification, Normal distribution
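One simple way to model the intra-person variability of a single feature, in the spirit of the abstract and its normal-distribution keyword, is to fit a per-signer Gaussian from genuine reference signatures and accept a questioned value only if it lies within the signer's own spread. The feature, the 2-sigma acceptance band and the function names are illustrative assumptions, not the paper's verifier.

```python
import statistics

def fit_feature_model(samples):
    """Fit a per-signer normal model (mean, std. deviation) for one
    signature feature, e.g. a skew angle, from genuine references."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mu, sigma

def is_consistent(value, mu, sigma, k=2.0):
    # Accept the questioned feature if it falls within k standard
    # deviations of this signer's own variability.
    return abs(value - mu) <= k * sigma
```

Because the model is fitted per signer, a naturally "sloppy" writer gets a wide acceptance band while a very consistent writer gets a narrow one — which is exactly what a fixed global threshold cannot express.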

Title of the Paper: Identifying the Technology Trend of Visual Language Researches


Authors: Yu L. Gingchi, Shih-Fann Chao, Hung-Jen Yang

Abstract: Major developments in visual language enhancement and brain visualization will not only change our future knowledge of the field but also offer a glimpse into the brain’s mysterious workings. Knowledge of graphics, drawing, information design, visual communication, visual language, and visual literacy is important for our ability to produce effective messages. In this study, visual learning is explored by identifying the characteristics of visual language and its converging technologies. Based on theories of visual perception and cognition and on the attributes of visualization, text design, representation, image design, graphic design and brain processes are discussed. The goals of visual language and of recording brain activity were identified through meta-analysis.

Keywords: Visual Language, Visual Converging Technology, Brain Technology

Issue 12, Volume 9, December 2010

Title of the Paper: A New Image Edge Detection Method Inspired from Biological Visual Cortex


Authors: Zuojin Li, Liukui Chen, Hongpeng Yin, Jia Yu

Abstract: The Gabor function can be expressed as a dual function combining positional localization with selectivity for the spatial frequency of the receptive field, and it effectively describes the receptive-field properties of the simple cells in the human visual cortex. However, the existing literature provides no strict mathematical proof of this fact. This work integrates the Gabor function into an integral-transform framework and proves that the integral transformation with a Gabor kernel responds to edge stimuli as the visual cortex does. Experiments show that the integral transformation can extract image edge features.

Keywords: Gabor function, Integral transformation, Edge detection
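The integral transformation the abstract refers to amounts to correlating the image with an oriented Gabor kernel: an odd-symmetric kernel integrates to zero over flat regions but responds strongly where intensity steps across its orientation. The sketch below uses pure Python with illustrative parameter values (kernel size, wavelength, sigma), which are assumptions and not the paper's settings.

```python
import math

def gabor_kernel(size=9, theta=0.0, lam=4.0, sigma=2.0):
    """Odd-symmetric (sine-phase) Gabor kernel oriented at `theta`:
    a Gaussian envelope times a sinusoid of wavelength `lam`."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            row.append(math.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
                       * math.sin(2 * math.pi * xr / lam))
        kernel.append(row)
    return kernel

def gabor_response(image, kernel):
    """Magnitude of the image/kernel correlation (the 'integral
    transformation' evaluated at every valid position)."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    return [[abs(sum(image[i + u][j + v] * kernel[u][v]
                     for u in range(kh) for v in range(kw)))
             for j in range(iw - kw + 1)]
            for i in range(ih - kh + 1)]
```

On an image containing a vertical step edge, the response peaks along the edge and vanishes in the uniform regions, which is the edge-extraction behaviour the experiments describe.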

Title of the Paper: Performance Analysis of Maintaining Mobile-Based Social Network Models


Authors: Peter Ekler, Mark Asztalos

Abstract: Although social networks are becoming more and more important in everyday life, their support on mobile devices is still not satisfactory. Automatic synchronization of mobile phone books with the data of social network members requires special and efficient algorithms, covering both the management of the global network model and the client software running on different mobile devices. In this work, we present Phonebookmark, a mobile-based social network implementation that has been tested by hundreds of registered members. We present a general mathematical model of the system that makes formal analysis possible. We introduce algorithms for automatically discovering similarities between contacts and members and for maintaining these similarities in the global model. We give an estimate of the complexity of the presented algorithms, which makes it possible to extend other existing social networks with the mobile-device support specified in this paper.

Keywords: Social networks, Modeling, Graph transformation, Complexity estimation, Mobile phones, Phonebook
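The similarity discovery between phone-book contacts and network members that the abstract mentions can be sketched as a score combining fuzzy name matching with normalized phone-number comparison. The field names, weights and threshold below are illustrative assumptions, not Phonebookmark's actual algorithm.

```python
from difflib import SequenceMatcher

def normalize(phone):
    """Keep digits only and compare the trailing digits, so that
    '+36 30 123 4567' and '06301234567' match despite prefixes."""
    digits = "".join(c for c in phone if c.isdigit())
    return digits[-9:]

def contact_similarity(contact, member):
    """Score in [0, 1]: half from name similarity, half from an
    exact match on the normalized phone number."""
    name_sim = SequenceMatcher(None, contact["name"].lower(),
                               member["name"].lower()).ratio()
    phone_match = normalize(contact["phone"]) == normalize(member["phone"])
    return 0.5 * name_sim + 0.5 * (1.0 if phone_match else 0.0)

def find_similar(contact, members, threshold=0.7):
    # Candidate members whose score reaches the detection threshold.
    return [m for m in members
            if contact_similarity(contact, m) >= threshold]
```

In a full system such pairwise scores are what the global model must then maintain incrementally as contacts and members change, which is where the complexity analysis in the paper becomes relevant.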

Title of the Paper: Effect of Serialized Routing Resources on the Implementation Area of Datapath Circuits on FPGAs


Authors: Sebastian Ip, Andy Gean Ye

Abstract: In this work, we investigate the effect of serialization on the implementation area of datapath circuits on FPGAs. With ever-increasing logic capacity, FPGAs are being increasingly used to implement large datapath circuits. Since datapath circuits are designed to process multiple-bit wide data, FPGA routing resources, which typically account for a significant fraction of FPGA area, are routinely used to transport multiple-bit wide signals. Consequently, it is important to design efficient routing architectures for transporting such signals on FPGAs. Serialization, where several bits of a signal are first time-multiplexed and then transported over a single wire, has been used effectively to increase the I/O bandwidth of FPGAs, and recent work has proposed using it to increase the area efficiency of FPGA routing resources for multiple-bit wide signals. Most of that work, however, has focused on circuit-level design issues; little has been done on the overall effect of serialization on FPGA area efficiency. We propose a detailed FPGA routing architecture containing a set of serial routing resources, together with an associated routing tool. Using the architecture and the tool, we measure the effect of serialization on active area and track count. We found that, for benchmarks containing four-bit wide datapath circuits, serialization can achieve a maximum active area reduction of 6.4% and a routing track reduction of 29%.

Keywords: Field-Programmable Gate Arrays, Serial Routing Resources, Routing, Area Efficiency
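The core idea of serialization — time-multiplexing the bits of a multi-bit signal onto a single wire and reassembling them at the far end — can be shown with a small behavioural model. This is a software illustration of the concept only, not the paper's circuit-level architecture; the four-bit width matches the benchmarks mentioned in the abstract.

```python
def serialize(words, width=4):
    """Time-multiplex each `width`-bit word onto one wire, LSB first.
    Returns the bit stream a single serial track would carry."""
    bits = []
    for w in words:
        for i in range(width):
            bits.append((w >> i) & 1)
    return bits

def deserialize(bits, width=4):
    """Reassemble `width`-bit words from the serial bit stream."""
    words = []
    for i in range(0, len(bits), width):
        w = 0
        for j, b in enumerate(bits[i:i + width]):
            w |= b << j
        words.append(w)
    return words
```

The trade-off the paper quantifies follows directly: a four-bit bus needs one track instead of four (fewer routing tracks), at the cost of serializer/deserializer logic and extra cycles (active area and latency).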

Title of the Paper: Toward a Unique Person Identifier Model in the Slovak Republic


Authors: Ladislav Huraj

Abstract: A national identification number for natural persons is used by the governments of many countries in various ways; its main objective is to improve security in online services and to unify authentication. In the Slovak Republic, a national identification number based on the birth date is currently used, which does not comply with EU legislation. In this article we describe our new prototypes of a Unique Person Identifier, prepared at the request of the Ministry of Interior of the Slovak Republic. The scheme is based on the idea of the Austrian system. Moreover, a proposal for an electronic identity management (IDM) system in the Slovak Republic is presented.

Keywords: Identification, Unique person identifier, Sector-specific personal identifier, Meaningless identifier
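The Austrian idea the abstract builds on is to derive sector-specific identifiers by a keyed one-way transformation of a base identifier plus a sector code, so that a person's identifiers in different sectors (health, tax, ...) cannot be linked to each other or reversed to the base number. The sketch below uses HMAC-SHA256; the key, encoding and sector labels are illustrative assumptions, not the proposed Slovak scheme.

```python
import hashlib
import hmac

def sector_specific_pin(source_pin, sector, key=b"registry-secret"):
    """Derive a meaningless, sector-specific identifier from a base
    identifier (source PIN) and a sector code via a keyed hash.
    Deterministic per (person, sector); unlinkable across sectors."""
    msg = (source_pin + "+" + sector).encode("utf-8")
    return hmac.new(key, msg, hashlib.sha256).hexdigest()
```

The same person thus presents a stable identifier within each sector, while cross-sector linkage would require the registry's key — the property that distinguishes such meaningless identifiers from a birth-date-based number.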



Copyright © WSEAS