WSEAS CONFERENCES. WSEAS, Unifying the Science



 Volume 8, 2009
Print ISSN: 1109-2750
E-ISSN: 2224-2872

Issue 1, Volume 8, January 2009

Title of the Paper: Industrial Machinery Optimization and Maintenance System via World Wide Web


Authors: A. Etxebarria, R. Barcena, J. J. Valera

Abstract: Nowadays, engineering companies frequently use the Internet as the main working environment for the complementary services they offer to their customers. Furthermore, the Internet allows the development of powerful tools dedicated to remote maintenance, on-line supervision and performance optimisation of industrial machinery. In this paper, an extensive set of complementary services via the Internet, called IngeRASTM, which provides rapid and effective access to the industrial machinery supplied by the Basque company Ingelectric S. A., is described. Additionally, a new remote system dedicated to hardware-in-the-loop (HIL) experimentation for optimising the performance of the industrial controllers installed by the company is presented. This system, now under construction, exploits the rapid-prototyping strategy of real-time computer-controlled systems and is expected to be added to IngeRASTM in the near future so that engineers can use it remotely by means of virtual instrumentation.

Keywords: Remote experimentation, Teleoperation, Virtual instrumentation, Internet, Telelaboratory, Realtime control

Title of the Paper: A Dynamic-Balanced Scheduler for Genetic Algorithms for Grid Computing


Authors: A. J. Sanchez Santiago, A. J. Yuste, J. E. Munoz Exposito, S. Garcia Galan, J. M. Maqueira Marin, S. Bruque

Abstract: The new paradigm of distributed computation, grid computing, has given rise to a large amount of research on resource scheduling. Unlike conventional distributed computation, grid computing uses heterogeneous resources, which entails new challenges such as adapting parallel algorithms originally developed for homogeneous clusters to dynamic, heterogeneous resources. In this paper we present a dynamic-balanced scheduler for grid computing that solves two typical kinds of grid problems by exploiting the spare cycles of some resources of the grid. The first problem is based on iterative tasks that usually appear in optimization problems. The second is a directed acyclic graph (DAG) problem. Experimental results show that the dynamic-balanced scheduler makes improved use of the resources in the grid. The strategy adapts the length of a task to the computing capacity of each resource at any given moment and, furthermore, executes all the tasks in a shorter time.
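The core idea of the abstract — adapting task length to each resource's current computing capacity — can be sketched as a proportional partitioning step. This is a minimal illustration, not the paper's scheduler; the function name and capacity units are invented.

```python
def balanced_chunks(total_tasks, capacities):
    """Split an iterative workload across heterogeneous resources
    proportionally to their current computing capacity (a sketch of
    the dynamic-balancing idea; names are illustrative)."""
    cap_sum = sum(capacities)
    # Provisional proportional share for each resource.
    shares = [int(total_tasks * c / cap_sum) for c in capacities]
    # Hand out the rounding remainder, fastest resources first.
    remainder = total_tasks - sum(shares)
    order = sorted(range(len(capacities)), key=lambda i: -capacities[i])
    for i in order[:remainder]:
        shares[i] += 1
    return shares

print(balanced_chunks(100, [4, 2, 2]))  # → [50, 25, 25]
```

Recomputing the shares whenever capacities change gives the dynamic behaviour the abstract describes.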

Keywords: Grid computing, dynamic-balanced scheduler, genetic algorithm, optimization problem

Title of the Paper: The Software Quality Economics Model for Software Project Optimization


Authors: Ljubomir Lazic, Amel Kolasinac, Dzenan Avdic

Abstract: Many definitions of quality given by experts explain quality for the manufacturing industry, yet quality still cannot be defined with absolute clarity for software engineering. To enable software designers to achieve higher quality, they need better insight into the quality implications of their design choices. In this paper we propose a model that traces design decisions and their possible alternatives. With this model it is possible to minimize the cost of switching between design alternatives when the current choice cannot fulfil the quality constraints. We do not aim to automate the software design process or the identification of design alternatives; rather, we aim to define a method that assists the software engineer in evaluating design alternatives and adjusting design decisions in a systematic manner. As of today, very little knowledge is available about the economics of software quality: the costs incurred and the benefits gained by implementing different quality practices over the software development life cycle are not well understood. Some propositions have not been tested comprehensively, but a useful economic model of the cost of software quality (CoSQ) and data from industry are described in this article. Significant research is needed to understand the economics of implementing quality practices; such research must evaluate the cost-benefit trade-offs of investing in quality practices so that the returns are maximized over the software development life cycle. From a developer's perspective, two types of benefits can accrue from the implementation of good software quality practices and tools: money and time. A financial ROI calculation of cost savings and a schedule ROI calculation of schedule savings are given.

Keywords: Software Quality, Quality Cost Model, TQM, Cost optimization, ROI calculation

Title of the Paper: Realization of E-University for Distance Learning


Authors: Hazem M. El-Bakry, Nikos Mastorakis

Abstract: In previous work [27], the authors proposed an E-University but did not take into account some essential parts. In this paper, the system presented in [27] is modified and developed. New critical items such as security in E-Learning, learning management, business continuity management and a science park are added to the proposed university. Given the great development of IT, current Web-based learning systems need to be as effective as human tutors. Recently, intelligent agents have become one of the most interesting subjects of modern information technology, and agent-based technology has been taken as an important approach for developing advanced E-Learning systems. This paper presents an architecture for implementing a multi-agent system within the context of a learning environment. The roles of intelligent agents within an E-Learning system, called E-University, are presented. The agents perform specific tasks on behalf of students, professors, administrators, and other members of the university. A group of intelligent agents for learning activities, such as user interface agents, task agents, knowledge agents and mobile agents, is also developed. Using multi-agent technology in an E-Learning system eases interaction for both users and designers, and adds the ability to exchange information between different objects in a flexible way.

Keywords: Intelligent learning systems, Agent technology, E-Learning system, Science park, Security, Learning management, Business continuity management, Multimedia

Title of the Paper: A Sampling-based Method for Dynamic Scheduling in Distributed Data Mining Environment


Authors: Jifang Li

Abstract: In this paper, we propose a new solution for dynamic task scheduling in a distributed environment. The key issue in scheduling such tasks is that the execution time of irregular computations cannot be obtained in advance. For this reason, we propose a method based on sampling some typical data mining algorithms. We argue that a function relates execution time, data size and algorithm, so the execution time of a data mining task can be deduced from the corresponding data size and algorithm. The experimental results show that almost all the algorithms exhibit quasi-linear scalability, although the slope differs from algorithm to algorithm. We adopt this sampling method for task scheduling in a distributed data mining environment. The experimental results also show that the sampling method is applicable to task scheduling in a dynamic environment and yields better results.
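The quasi-linear relation between data size and execution time suggests a simple prediction recipe: sample a few runs, fit a line, extrapolate. A minimal sketch (the sampled sizes and runtimes below are invented for illustration):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b (closed form).
    Sketches the sampling idea: run an algorithm on a few sample
    sizes, fit the quasi-linear trend, then extrapolate."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical sampled runtimes (seconds) at growing data sizes (MB).
sizes = [10, 20, 40, 80]
times = [1.1, 2.0, 4.1, 7.9]
a, b = fit_line(sizes, times)
predicted = a * 160 + b   # extrapolated runtime for a 160 MB task
```

The fitted slope `a` differs per algorithm, matching the abstract's observation that each algorithm has its own quasi-linear trend.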

Keywords: Sampling, Data Mining, Distributed Computing

Title of the Paper: New Discussion on Information Communication Model and Process Model in Software Organization


Authors: Xuejun Liu

Abstract: Most process models for software development, such as the waterfall model, are based mainly on describing the relationship between the scale and complexity of a software project and the technical tasks and methods in every phase of development. They suit the technical work of software development, but ignore the management work within the software process, especially management related to the concrete business; for process management, a technical standard alone is not enough. In this article, we propose an information communication model of the software process and a software process model that includes the concrete business, and we describe the basic tasks of the team software process and the relationships between the layers of the software process. On this basis, the models can guide the process operation of a software team, provide a new solution for theoretical research and description in software engineering, and provide a framework concept for auxiliary tools supporting software engineering research.

Keywords: Process models, software development, waterfall model

Title of the Paper: Research and Application of SQLite Embedded Database Technology


Authors: Chunyue Bi

Abstract: The embedded database SQLite is widely applied in data management for embedded environments such as mobile devices, industrial control and information appliances, and has become a focus of development in related areas. Its stability, reliability, speed, efficiency and portability give it unique advantages among the main embedded databases. This paper first describes the definition, basic characteristics, structure and key technologies of embedded databases; analyses the features, architecture and main interface functions of SQLite; gives a detailed porting process from SQLite to the ARM-Linux platform; and discusses a concrete application of SQLite in an embedded system through a development case of a home gateway based on ARM-Linux.

Keywords: Embedded Database, SQLite, Porting, ARM-linux, Home Gateway

Title of the Paper: Design and Implementation Mobile Payment Based on Multi-Interface of Mobile Terminal


Authors: Zhong Wan, Weifeng Yin, Ronggao Sun

Abstract: The security of mobile payment faces two main problems: the mobile terminal itself and the wireless communication process. The portability, small size and low cost of the mobile terminal leave it with weak encryption computing ability, and the emergence of mobile phone viruses is a further potential security threat to mobile payment. At the same time, the openness of wireless communication means that a radio channel can easily be tapped by an illegal user, who can tamper with or delete intercepted information, or impersonate a legal user to access the communication network. This article presents a mobile payment solution with distributed keys based on the multiple interfaces of current mobile terminals. It analyses the workflow of mobile payment and the causes of its security problems. Additional equipment is used to store the secret keys in a distributed fashion, improving the system's verification ability and enhancing the encryption computing capability of the mobile terminal. At the same time, based on the J2ME security architecture, data encryption, digital signatures and identity authentication are used to ensure the security of the wireless communication. The hardware and software design of the multi-interface data encryption equipment is given, along with the design of some processes of the mobile client software. This solution achieves high security and low cost for mobile payment, and has good applied value and market prospects.

Keywords: Payment Security, Mobile Payment, Data Encryption, Mobile Terminal, J2ME, Mobile interface

Title of the Paper: An Anycast Routing Algorithm Based on Genetic Algorithm


Authors: Chun Zhu, Min Jin

Abstract: Anycast refers to the transmission of data from a source node to (any) one member of a group of designated recipients in a network, and has been defined as a standard communication model of IP version 6. In order to implement multi-destination, multi-path anycast routing on a heavily loaded network, this paper proposes a new anycast routing algorithm based on a genetic algorithm and presents a heuristic genetic algorithm to solve the shortest-path routing optimization problem. Network simulation results show that the algorithm is capable of finding an optimized anycast routing path, obtaining more solutions in less time with a more balanced network load, and enhancing the search ratio.
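The anycast service model the abstract relies on — deliver to the nearest member of the destination group — can be illustrated with a plain Dijkstra baseline. This is a reference behaviour, not the paper's genetic algorithm; the toy graph below is invented.

```python
import heapq

def anycast_route(adj, src, group):
    """Shortest path from src to the nearest member of an anycast
    group, via Dijkstra. Because the heap pops nodes in distance
    order, the first group member popped is the nearest one."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in group:
            # Reconstruct the path back to the source.
            path = [u]
            while u != src:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return None

adj = {"s": [("a", 1), ("b", 4)], "a": [("t1", 5)], "b": [("t2", 1)]}
print(anycast_route(adj, "s", {"t1", "t2"}))  # → (5, ['s', 'b', 't2'])
```

A GA-based router would search the same space of member/path choices, trading this exactness for better load balancing across multiple paths.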

Keywords: Anycast Routing, Anycast Routing Algorithm, Genetic Algorithm, Service Model

Title of the Paper: A Distributed E-Business System Based on Conic Curve


Authors: Xinxia Song, Zhigang Chen

Abstract: A distributed E-Business system based on conic curves is proposed. The scheme is composed of two parts: constructing a license and validating a license. Because the security of a license is determined by the private key, not the algorithm itself, a user cannot construct a new license from a given license and the public key as long as the private key is not leaked. Since encoding and decoding over conic curves are easily implemented, the scheme greatly enhances efficiency. We also analyse its security; the entire process guarantees security and reliability.

Keywords: Conic curve, E-Business system, public-key cryptosystem, digital signature, proxy signature

Title of the Paper: The Study of Terrain Simulation based on Fractal


Authors: Deng Fang

Abstract: In this paper the principles of fractal theory and fractional Brownian motion (FBM) are analysed in detail. The Diamond-Square algorithm is introduced, along with the process of generating three-dimensional terrain, clouds and sky with this algorithm. The physical meaning of FBM's two parameters, H and σ, is also analysed. Through analysis of the effect of different parameters, a method of parameter-controllable fractal terrain is proposed, and a controlled fractal scene is simulated via VC++ and OpenGL. The limitations of the algorithm are also pointed out, but since its generating speed is high, it remains valid and practical.
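The Diamond-Square midpoint-displacement scheme named in the abstract can be sketched in a few lines; here `roughness` plays a role analogous to the FBM parameter H (smaller values give smoother terrain), and all parameter values are illustrative:

```python
import random

def diamond_square(n, roughness=0.6, seed=1):
    """Generate a (2**n + 1)-square heightmap with Diamond-Square.
    A minimal sketch: corners start at 0, displacement shrinks by
    `roughness` at each subdivision level."""
    size = 2 ** n + 1
    h = [[0.0] * size for _ in range(size)]
    rng = random.Random(seed)
    step, scale = size - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: centre of each square from its 4 corners.
        for y in range(half, size, step):
            for x in range(half, size, step):
                avg = (h[y - half][x - half] + h[y - half][x + half] +
                       h[y + half][x - half] + h[y + half][x + half]) / 4
                h[y][x] = avg + rng.uniform(-scale, scale)
        # Square step: edge midpoints from their in-bounds neighbours.
        for y in range(0, size, half):
            for x in range((y + half) % step, size, step):
                nbrs = [(y - half, x), (y + half, x), (y, x - half), (y, x + half)]
                vals = [h[py][px] for py, px in nbrs
                        if 0 <= py < size and 0 <= px < size]
                h[y][x] = sum(vals) / len(vals) + rng.uniform(-scale, scale)
        step, scale = half, scale * roughness
    return h
```

Feeding the resulting grid to a renderer (OpenGL in the paper's case) as a triangle mesh yields the fractal terrain; varying `roughness` reproduces the parameter-controllability the abstract discusses.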

Keywords: Fractal, FBM, Diamond-Square, 3D terrain, OpenGL

Title of the Paper: Face Recognition based on Multi-scale Singular Value Features


Authors: Ran Jin, Zhuojun Dong

Abstract: The singular value vector of an image is a valid feature for identification, but the recognition rate is low when only a single-scale singular value vector is used for face recognition. An algorithm was developed to improve the recognition rate. Many subimages are obtained when the face image is divided at different scales, and all singular values of each subimage are organized and used as an eigenvector of the face image. Faces are then verified by linear discriminant analysis (LDA) on these multiscale singular value vectors. The multiscale singular value vectors include all features of an image, from the local to the whole, so more discriminant information for pattern recognition is obtained. Experiments were made with the ORL human face image database. The experimental results show that the method is clearly superior to the corresponding algorithms, with a recognition rate of 97.38%.
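The feature construction the abstract describes — divide the image into subimages and concatenate their singular values — can be illustrated at toy scale. Closed-form 2×2 singular values stand in for a general SVD routine here; the block size and image are illustrative, not the paper's settings.

```python
import math

def singular_values_2x2(a, b, c, d):
    """Singular values of [[a, b], [c, d]] as the square roots of
    the eigenvalues of A^T A (closed form for the 2x2 case)."""
    m11, m22, m12 = a * a + c * c, b * b + d * d, a * b + c * d
    tr, det = m11 + m22, m11 * m22 - m12 * m12
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return (math.sqrt((tr + disc) / 2), math.sqrt(max((tr - disc) / 2, 0.0)))

def multiscale_features(img):
    """Concatenate the singular values of every 2x2 sub-block of a
    small image (even height/width assumed) -- a toy version of the
    multiscale singular value feature vector; a real system would
    repeat this at several block scales and feed the vector to LDA."""
    feats = []
    for y in range(0, len(img), 2):
        for x in range(0, len(img[0]), 2):
            feats.extend(singular_values_2x2(img[y][x], img[y][x + 1],
                                             img[y + 1][x], img[y + 1][x + 1]))
    return feats
```

For realistic image sizes one would of course call a library SVD per subimage instead of the closed form.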

Keywords: Multiscale, singular value decomposition (SVD), feature combination, face recognition, Fisher, ORL, subimage

Title of the Paper: Robust Denoising of Point-Sampled Surfaces


Authors: Jifang Li, Renfang Wang

Abstract: Based on sampling likelihood and feature intensity, a feature-preserving denoising algorithm for point-sampled surfaces is proposed in this paper. In terms of the moving least squares surface, the sampling likelihood of each point on the point-sampled surface is computed, which measures the probability that a 3D point lies on the sampled surface. Based on normal tensor voting, the feature intensity of each sample point is evaluated. By applying modified bilateral filtering to each normal, in combination with sampling likelihood and feature intensity, the filtered point-sampled surfaces are obtained. Experimental results demonstrate that the algorithm is robust, and can remove noise efficiently while preserving surface features.

Keywords: Moving least squares surface, sampling likelihood, normal voting tensor, feature intensity, bilateral filtering, point-sampled surfaces denoising

Title of the Paper: An Adaptive Requirement Framework for SCUDWare Middleware in Ubiquitous Computing


Authors: Qing Wu, Danzhen Wang

Abstract: The highly dynamic computing environments of ubiquitous computing pose many challenges for middleware technologies. Component-based middleware systems should possess self-adjusting functions to adapt to variations in internal and external environments. This paper first describes a middleware called SCUDWare for smart vehicle spaces in ubiquitous computing. We then propose a middleware requirement model covering the variable requirements of users and resources. After that, a component dynamic behavior model is presented. Next, an adaptive requirement framework is given in detail, which can automatically tune middleware configuration parameters and conduct safe, dynamic component composition to preserve the middleware QoS requirements. Finally, the framework is prototyped and validated by using a mobile music program to analyse its performance.

Keywords: Ubiquitous Computing, Adaptive middleware, Component-based technology

Title of the Paper: Blind Watermark Algorithm Based on HVS and RBF Neural Network in DWT Domain


Authors: Yanhong Zhang

Abstract: This paper proposes a new blind watermarking scheme in the discrete wavelet transform (DWT) domain. The method uses the human visual system (HVS) model and radial basis function (RBF) neural networks; the RBF network is applied while embedding and extracting the watermark. The HVS model is used to determine the watermark insertion strength. After training, the neural networks almost exactly recover the watermark signals from the watermarked images. The experimental results show that the proposed watermark is invisible (the PSNR is higher than 41) and is robust against common attacks such as JPEG compression, additive noise and filtering.

Keywords: Blind digital watermarking, wavelet transform, RBF neural network, robustness

Title of the Paper: Research and Practice of Distributed Test Platform


Authors: Feng Qinqun, Yun Wenfang, Peng Sheqiang

Abstract: After a concise introduction to software testing and its significance, an integrated Distributed Test Platform environment for software testing is proposed, and its two components, the Test Server and the Test Driver, are described. The composition of the Test Server, based on a Metadata Service, is presented, and the elements of the Metadata Service and its runtime principles are described in detail. Finally, a summary is given.

Keywords: Software Test, Distributed Test Platform, Test Driver, Test Server, Metadata Service

Issue 2, Volume 8, February 2009

Title of the Paper: A Fast Geometric Rectification of Remote Sensing Imagery Based on Feature Ground Control Point Database


Authors: Jian Yang, Zhongming Zhao

Abstract: On the basis of the traditional design of ground control point databases, this paper develops a fast auto-correction method for satellite remote sensing imagery based on a feature ground control point database, which brings in local feature points as an effective supplement. The method aims to achieve automatic matching between feature ground control points and the original images that need geometric correction, and improves the rectification process using random sample consensus (the RANSAC algorithm). With this method, the authors realize auto-extraction of feature ground control points, ensuring fast and precise geometric correction of high volumes of satellite remote sensing images by means of the feature ground control point database algorithm.

Keywords: Ground Control Point (GCP) Database, feature matching, local invariance, feature ground control points

Title of the Paper: Mitigation of the Effects of Selfish and Malicious Nodes in Ad-hoc Networks


Authors: Houssein Hallani, Seyed A. Shahrestani

Abstract: A wireless Ad-hoc network is a group of wireless devices that communicate with each other without utilising any central management infrastructure. The operation of Ad-hoc networks depends on the cooperation among nodes to provide connectivity and communication routes. However, such an ideal situation may not always be achievable in practice. Some nodes may behave maliciously, resulting in degradation of the performance of the network or even disruption of its operation altogether. To mitigate the effect of such nodes and to achieve higher levels of security and reliability, this paper expands on relevant fuzzy logic concepts to propose an approach to establish quantifiable trust levels between the nodes of Ad-hoc networks. These trust levels are then used in the routing decision making process. Using OPNET and MATLAB simulators, the proposed approach is validated and further studied. The findings show that when the proposed approach is utilised, the overall performance of the Ad-hoc network is significantly improved.
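One way to make the fuzzy-logic trust idea concrete is to map an observed packet-forwarding ratio onto a trust level through triangular membership functions. This is a sketch only; the membership breakpoints and output weights below are invented, not those of the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trust_level(forward_ratio):
    """Map a node's observed packet-forwarding ratio in [0, 1] to a
    trust level via three fuzzy sets (low / medium / high) and a
    weighted-average defuzzification. Breakpoints are illustrative."""
    low = tri(forward_ratio, -0.01, 0.0, 0.5)
    med = tri(forward_ratio, 0.2, 0.5, 0.8)
    high = tri(forward_ratio, 0.5, 1.0, 1.01)
    num = 0.1 * low + 0.5 * med + 0.9 * high  # representative trust values
    den = low + med + high
    return num / den if den else 0.5
```

A node forwarding 90% of the packets it receives scores a trust level near 0.9, while a selfish node forwarding 10% scores near 0.1; a routing protocol can then prefer next hops whose trust exceeds a threshold, as in the approach the abstract describes.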

Keywords: Ad-hoc networks, Behavior analysis, Malicious attacks, Simulation, Throughput

Title of the Paper: Modal Learning Neural Networks


Authors: Dominic Palmer-Brown, Sin Wee Lee, Chrisina Draganova, Miao Kang

Abstract: This paper will explore the integration of learning modes into a single neural network structure in which layers of neurons or individual neurons adopt different modes. There are several reasons to explore modal learning. One motivation is to overcome the inherent limitations of any given mode (for example some modes memorise specific features, others average across features, and both approaches may be relevant according to the circumstances); another is inspiration from neuroscience, cognitive science and human learning, where it is impossible to build a serious model without consideration of multiple modes; and a third reason is non-stationary input data, or time-variant learning objectives, where the required mode is a function of time. Two modal learning ideas are presented: The Snap-Drift Neural Network (SDNN) which toggles its learning between two modes, is incorporated into an on-line system to provide carefully targeted guidance and feedback to students; and an adaptive function neural network (ADFUNN), in which adaptation applies simultaneously to both the weights and the individual neuron activation functions. The combination of the two modal learning methods, in the form of Snap-drift ADaptive FUnction Neural Network (SADFUNN) is then applied to optical and pen-based recognition of handwritten digits with results that demonstrate the effectiveness of the approach.

Keywords: Modal Learning, Snap-drift, ADFUNN, SADFUNN, e-learning, Personalized Learning, Diagnostic Feedback, Multiple Choice Questions

Title of the Paper: An Improved BP Neural Network for Wastewater Bacteria Recognition Based on Microscopic Image Analysis


Authors: Li Xiaojuan, Chen Cunshe

Abstract: Microscopic images of wastewater bacteria are analysed, and a scheme for classifying and recognizing wastewater bacteria based on microscopic image analysis is put forward in this paper. An adaptive, enhanced edge detection solution for images of wastewater bacteria is proposed, which can effectively remove noise in the images and obtain clear edges in the microscopic image by optimizing the segmentation threshold and varying the order of edge detection. Seven contour invariant moment features and four morphological features are extracted by analysis of the microscopic images, of which six features are chosen by PCA in order to reduce the dimensionality of the features extracted from the images. A self-adaptive accelerated BP algorithm is developed for training the classifier of bacteria microscopic images. The proposed method is tested with the CECC database, and the results show that the presented image recognition solution is effective and can greatly improve the speed and consistency of large-scale surveys or rapid determination of bacterial abundance and morphology.

Keywords: BP Neural Network, Edge Detection, Wastewater Bacteria, Contour Invariant Moment

Title of the Paper: Algorithm for Optimal Dimensioning of Three Phase and Mono Phase Electric Power Lines Implemented in Java


Authors: Cristian Abrudean, Manuela Panoiu

Abstract: This paper presents a Java software package for dimensioning low-voltage mono-phase and three-phase electric power lines (AC and DC). For electric grid modelling, a tree structure is used. The package allows the estimation and calculation of the parameters of electric power transmission lines. The software has a graphical user interface, so the user can input data and other characteristics of the electric line, passing to the calculation stage once the data are correctly and completely entered. A recursive algorithm is used to calculate the total active and reactive currents and the admissible voltage loss based on the characteristics of the users (consumers). The results are shown on the screen.

Keywords: Computer software, electrical grid, optimal dimensioning, Java

Title of the Paper: Visual Interactive Environment for doing Geometrical Constructions


Authors: Anca Iordan, George Savii, Manuela Panoiu, Caius Panoiu

Abstract: This work presents the development of an educational software system for doing geometry on a computer; in a way, it replaces pencil, paper, ruler and compass with equivalent computer tools. The resulting system can be used for teaching Euclidean geometry in both pre-university and university education.

Keywords: Dynamical software, Euclidean geometry, Java, distance education

Title of the Paper: UML4ODP: OCL 2.0 Constraints Specification & UML Modeling of Interfaces in the Computational Metamodel


Authors: Oussama Reda, Bouabid El Ouahidi, Daniel Bourget

Abstract: The purpose of this work is the analysis of computational language concepts and the introduction of pertinent novel ones in order to provide a new computational metamodel of interaction signatures in UML4ODP FDIS. We mainly introduce the concept of the functional computational interface, which unifies the notions of signal and operation interfaces. The unification of signal and operation interaction concepts is presented by introducing the parameterized interaction concept. We show that parameterized interactions are of two main kinds, primitives and compounds, and we also introduce the notion of incoming and outgoing primitives. As an application of our modeling choices, we redefine interaction, refinement and type-checking rules in a concise manner, and then specify them using the specification facilities of OCL 2.0, showing that the novel definitions as well as their specifications are easy to read, write and understand.

Keywords: RM-ODP, UML4ODP, Computational language, Meta-modeling, Computational interface, Interaction Signature, type checking, Interaction refinements

Title of the Paper: Towards a Study Opportunities Recommender System in Ontological Principles-based on Semantic Web Environment


Authors: Ana Maria Borges, Richard Gil, Marla Corniel, Leonardo Contreras, Rafael H. Borges

Abstract: Professional career selection is a complicated process for university student candidates, and few technical tools are available for those who aim to enter the higher education system, since a multiplicity of variables must be considered to obtain a "satisfactory answer" that comes near the idea they have preconceived. These variables build up a complex map of relations that requires the formulation of an exhaustive and rigorous conceptual scheme.
In this research, a domain ontological model is presented to support students' decision making about university-level study opportunities in the Venezuelan education system. For the declaration of the domain ontological model, the information provided by two organisms (OPSU & CNU) is used; both are responsible for designing the policies and strategies for higher education in Venezuela. The ontology is designed and created using the Methontology approach, since this methodology supports the progressive creation or capture and articulation of knowledge, its elements and relations. The ontology is represented with the Protege 3.1.1 tool, based on the ontological language for the Web, OWL (Web Ontology Language); finally, to access and visualize the ontology, an application based on the Semantic Web is developed.
In computation, computer science and systems, the concept of profiles or user models has many meanings and connotations. It first gained wide diffusion with the development and growing use of on-line systems; recently it has gained great prominence, not only to refer to and respond to the passive user who approaches "the system", but also because of the possibility of attending to a user stimulated and attracted by "the system". In that sense, the possibility of such "personalized attention" and/or recommendations from "the system" is obviously conditioned by the user context.
Centered on the possibility of giving that "customized attention" to users who must select one option among many, we intend in the future to construct a meta-ontology that integrates the domain ontology with a user profile ontology, aiming towards a semantic recommendation system under a multi-agent approach that could support students' decision making in career selection among the different study opportunities at the higher education level in Venezuela.

Keywords: Study Opportunities, Ontology, Semantic Web, Ontology Development, User Profiles, User Context, Intelligent Agents

Title of the Paper: Tactile Fabric Comfort Prediction Using Regression Analysis


Authors: Les M. Sztandera

Abstract: In this paper we explore complex relationships between mechanical and sensory properties of fabrics, and the perceived tactile comfort. Mechanical properties, measured objectively by Kawabata Evaluation System for Fabrics (KES-FB), and handfeel properties, measured subjectively by sensory expert panel, are related to the tactile comfort of fabrics using statistical regression approaches. A universe of 48 fabrics is examined to analyze and map the relations. The initial 17 mechanical and 17 handfeel parameter sets were reduced to 4 and 5 properties, respectively. Adjusted R2 values were 0.657 for mechanical and 0.863 for handfeel parameters, reflecting sound goodness-of-fit measures, and providing reasonable ways for prediction of tactile fabric comfort from mechanical and handfeel parameters.
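The adjusted R² values quoted above penalise plain R² for the number of predictors k relative to the sample size n; the standard formula can be computed directly (the example numbers below are invented, not the paper's fabric data):

```python
def adjusted_r2(y, y_hat, k):
    """Adjusted R-squared: 1 - (1 - R^2) * (n - 1) / (n - k - 1),
    where n is the sample size and k the number of predictors."""
    n = len(y)
    mean = sum(y) / n
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - mean) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# Invented toy data: 4 observations, one predictor.
print(round(adjusted_r2([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8], 1), 3))  # → 0.97
```

Because the penalty grows with k, reducing 17 candidate parameters to 4 or 5, as the paper does, can raise adjusted R² even if plain R² barely changes.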

Keywords: Tactile perception, fabric mechanical properties, regression analysis, textile property analysis

Title of the Paper: Identification of the Most Significant Properties Influencing Tactile Fabric Comfort Using Regression Analysis


Authors: Les M. Sztandera

Abstract: Engineered fabrics are being used increasingly in commercial and domain-specific systems. Such fabrics with specified consumer-desired characteristics can be computationally designed. Through the use of an extensive database that correlates sensory and mechanical properties with tactile comfort assessments, desired comfort can be predicted by measuring a limited number of properties. In this paper we are focusing on the most significant sensory and mechanical properties influencing tactile fabric comfort. Output systems can be optimized to exhibit the highest level of comfort by engineering a fabric with specific sensory and mechanical properties. This paper examines stepwise regression analysis and identifies the most significant properties influencing tactile fabric comfort. The reported Beta coefficients are the standardized regression coefficients. Their absolute magnitudes reflect their relative importance in predicting comfort values. A universe of 48 fabrics is examined to analyze and map the relations. The initial 17 mechanical and 17 sensory parameter sets are reduced to sets of 1 to 4 and 1 to 5 properties, respectively. Adjusted R2 values were 0.360 to 0.657 for mechanical and 0.713 to 0.863 for sensory parameters, reflecting sound goodness-of-fit measures, and providing reasonable ways for identifying the mechanical and sensory properties that are most significant influences on tactile fabric comfort. Elongation and hysteresis of shear force were found to be the most influential mechanical properties, while compression resilience rate and graininess were found to be the sensory properties that most impacted comfort.

Keywords: Tactile perception, fabric mechanical properties, stepwise regression analysis, standardized importance factors, textile property analysis
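The standardized Beta coefficients mentioned in the abstract rescale ordinary regression coefficients so that their absolute magnitudes are comparable across predictors. A minimal sketch of that rescaling (illustrative only; the variable names and data are not from the paper):

```python
import numpy as np

def standardized_betas(X, y):
    """OLS fit, then rescale slope coefficients to standardized (Beta) form:
    beta_std = beta * std(x_j) / std(y). Their absolute magnitudes indicate
    relative predictor importance, as in stepwise regression reporting."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])          # add intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    b = coef[1:]                                  # drop the intercept term
    return b * X.std(axis=0, ddof=1) / y.std(ddof=1)
```

For a single predictor, the standardized Beta equals the correlation between predictor and response, so a noiseless linear relation yields exactly 1.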

Title of the Paper: Web-based Multiuser Interior Design with Virtual Reality Technology


Authors: Yen-Chun Lin, Chen-Chuan Pan

Abstract: A Web-based interior design application system, named Multiuser Interior Design (MID), is presented. Technologies from virtual reality (VR), the Web, and databases are used to implement MID. It allows users to operate in VR fashion, interacting with 3D virtual objects and spaces through a Web browser. Users can view and elaborate designs, examine furniture, and even place purchase orders for furniture. MID contains various features, including multiuser collaboration and concurrent display of multiple views of a scene. MID can be modified for other e-commerce purposes; many of its functions serve as a reusable framework.

Keywords: E-commerce, Interior design, Internet, Multiuser collaboration, Reusable framework, Three-tier architecture, Virtual reality, Web-based

Title of the Paper: Digital Game-Based Learning (DGBL) Model and Development Methodology for Teaching History


Authors: Nor Azan Mat Zin, Wong Seng Yue, Azizah Jaafar

Abstract: The history subject plays a vital role in instilling the spirit of patriotism among students: to nourish love of and loyalty to one's country and to be a true citizen. The lack of creativity in history teaching prevents the learning process from being carried out effectively and interestingly, especially among young learners. The Digital Game-Based Learning (DGBL) approach utilizes games as a medium for conveying learning content. There are many models for educational game development that combine Instructional Design (ID) with the game development process. However, there is still disagreement on how to merge ID and game development effectively. Therefore, we looked into the pedagogy and game design aspects. We then propose a DGBL model for History educational game design and a development methodology that combines ID with the game development process, named the DGBL-ID model. The DGBL-ID model consists of five phases: analysis, design, development, quality assurance, and implementation and evaluation.

Keywords: Serious games, DGBL, Instructional Design (ID), History educational game, video games

Title of the Paper: Adaptive Multi-Constraints in Hardware-Software Partitioning for Embedded Multiprocessor FPGA Systems


Authors: Trong-Yen Lee, Yang-Hsin Fan, Chia-Chun Tsai

Abstract: An embedded multiprocessor field programmable gate array (FPGA) system has a powerful and flexible architecture that supports the interaction between hardware circuits and software applications. Modern electronic products, such as portable devices, consumer electronics and telematics, can be evaluated rapidly on this platform via the implementation of a set of hardware and software tasks. However, as functionality increases markedly, the number of hardware and software tasks rises significantly. Consequently, the solution space for hardware-software partitioning becomes too large to search exhaustively. Moreover, a partitioning result with low power consumption and fast execution time is difficult to obtain, since simultaneously meeting multiple constraints over hundreds of thousands of hardware-software partition combinations is difficult. Thus, this work presents a hardware-software partitioning scheme that obtains a result satisfying multiple constraints from a massive solution space; specifically, it attains a partitioning with low power consumption and fast execution time. The effectiveness of the proposed approach is demonstrated on a JPEG encoding system and a benchmark with 199 tasks.

Keywords: Adaptive multi-constraints partitioning, hardware-software partitioning, embedded multiprocessor system, FPGA system, hardware-software codesign

Title of the Paper: Intelligent Entry Control


Authors: Erik Dovgan, Matjaz Gams

Abstract: Entry control is an important issue for security and optimization reasons. Input sensors based on biometrics and intelligent methods that learn from experience are used to recognize terrorists or simply to detect unusual behavior of regular staff. We have designed and developed an intelligent entry system consisting of four independent modules: expert system, micro learning, macro learning and camera. In the experimental set-up, there are four input sensors: a door sensor, an identity card reader, a fingerprint reader and a camera. Each of the four modules produces an explanation that categorizes an event as alarm or normal, and the system proposes a final suggestion with an explanation.

Keywords: Ambient intelligence, access control, event classification

Title of the Paper: Activate the Dynamic Delegation Process in X.509 Certification via a New Extension


Authors: Moutasem Shafa'amry, Nisreen Alam Aldeen

Abstract: The growing number of clients and users of e-banking, e-government and other e-applications over digital communications has made it essential to develop new methods for authentication and secure access. Digital certificates are one such method for securing transactions, and X.509 is one standard for these certificates. Although the X.509 certificate offers a high level of security and authenticity, it has weaknesses, such as the lack of support for dynamic delegation. The proxy certificate, proposed as a practical solution in the field of dynamic delegation, has weaknesses of its own. These were the main motivation for this research: to come up with a new solution that integrates the advantages of the X.509 and proxy certificates, benefiting from the strengths of each while avoiding their weaknesses. This paper covers digital certificate standards and their relation to dynamic delegation, focusing on the weaknesses of applying these standards to dynamic delegation; we then propose our solution to make dynamic delegation applicable and more efficient within digital certificate standards. Finally, we cover the pros and cons of the new solution, some conclusions and future work.

Keywords: Digital certification systems, X.509 standard, Proxy Certificate Authentication (PCA), Dynamic delegation, Identity verification, Network security, e-security

Title of the Paper: Using Palette and Minimum Spanning Tree for True-Color Image Steganography


Authors: Show-Wei Chien, Yung-Fu Chen, Pei-Wei Yen, Hsuan-Hung Lin

Abstract: Steganography is an application of data hiding that achieves camouflage and increases security by embedding a secret message into digital media sent to a receiver without leaking it to a third party. Several previously proposed methods construct the stegoimage by embedding the secret message into the colour palette; the receiver then extracts the secret message from the palette of the received image. Such methods, however, greatly degrade the quality of the stegoimage and tend to arouse the suspicion of intruders. In this paper, we propose a method for constructing high-quality stegoimages that largely eliminates this problem. The advantage of the proposed method is that image quality is greatly improved while the security and camouflage of the secret message are also enhanced. To prevent intruders from attacking the generated palette during transmission, the sender does not need to send the palette to the receiver directly; instead, the receiver owns a copy of the original secret image, or obtains one when needed, for extracting the secret message. Fortunately, there are several publicly accessible image pools containing many images. The experimental results show that our method outperforms EZStego and the methods proposed by Fridrich and by Wu et al.

Keywords: Steganography, Palette optimization, Minimum Spanning Tree
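The minimum spanning tree named in the title can be illustrated generically: build an MST over palette colors with Prim's algorithm, using Euclidean distance in RGB space as the edge weight. This is a sketch of the data structure only, not the paper's embedding scheme; the sample palette is made up.

```python
import math

def palette_mst(colors):
    """Prim's algorithm: minimum spanning tree over a color palette.
    Edge weight = Euclidean distance between colors in RGB space.
    Returns a list of (u, v, weight) tree edges."""
    n = len(colors)
    in_tree = {0}                      # start the tree from color 0
    edges = []
    while len(in_tree) < n:
        # cheapest edge crossing from the tree to a color not yet in it
        w, u, v = min((math.dist(colors[u], colors[v]), u, v)
                      for u in in_tree for v in range(n) if v not in in_tree)
        edges.append((u, v, w))
        in_tree.add(v)
    return edges
```

An MST over the palette keeps perceptually close colors adjacent, which is why palette-ordering steganography schemes often use it.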

Title of the Paper: A Novel Field-Source Reverse Transform for Image Structure Representation and Analysis


Authors: X. D. Zhuang, N. E. Mastorakis

Abstract: The image source-reverse transform is proposed for image structure representation and analysis; it is based on an electro-static analogy. In the proposed transform, the image is taken as a potential field and the virtual source of the image is reversed, imitating Gauss's law. Region border detection is effectively implemented based on the virtual source representation of the image structure. Moreover, the energy concentration property of the proposed transform is investigated for promising application in lossy image compression. Experimental results indicate that the proposed source-reverse transform can achieve efficient representation of image structure and has promising applications in image processing tasks.

Keywords: Source-reverse transform, electro-static field, region border detection, lossy image compression
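One plausible discrete reading of the electro-static analogy above: if the image is a potential field, a "virtual source" can be recovered by applying the divergence of the gradient, i.e. the discrete Laplacian (Gauss's law relates source density to the divergence of the field). This is an illustrative sketch of that analogy, not the paper's actual transform, which may differ.

```python
import numpy as np

def virtual_source(img):
    """Treat the image as a potential field phi and recover a 'virtual
    source' via the 5-point discrete Laplacian (div grad phi). Interior
    pixels only; the one-pixel border is left at zero."""
    f = img.astype(float)
    src = np.zeros_like(f)
    src[1:-1, 1:-1] = (f[:-2, 1:-1] + f[2:, 1:-1] +
                       f[1:-1, :-2] + f[1:-1, 2:] - 4.0 * f[1:-1, 1:-1])
    return src
```

Flat and linearly varying regions produce a zero source, so the nonzero source concentrates at region borders, consistent with the border-detection use described above.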

Title of the Paper: Identifying Crosscutting Concerns using Partitional Clustering


Authors: Gabriela Czibula, Grigoreta Sofia Cojocar, Istvan Gergely Czibula

Abstract: Aspect mining is a research direction that tries to identify crosscutting concerns in already developed software systems, without using aspect oriented programming. The goal is to identify them and then refactor them to aspects, to achieve a system that can be easily understood, maintained and modified. In this paper we present a partitional clustering algorithm for identifying crosscutting concerns in existing software systems. We experimentally evaluate our algorithm on the open source JHotDraw case study, for three distance functions, providing a comparison of the proposed approach with similar existing approaches. The experimental evaluation leads us to the best semi-metric distance function to use in the clustering process.

Keywords: Aspect mining, crosscutting concerns, clustering
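A partitional clustering loop with a pluggable distance function, of the kind one would need to compare three distance functions as described above, can be sketched as follows. This is a generic k-medoids-style sketch, not the authors' algorithm; the sample points are made up.

```python
import random

def partition_clustering(points, k, dist, iters=20, seed=0):
    """Simple partitional clustering (k-medoids style) with a pluggable
    distance function `dist`. Alternates assignment to the nearest medoid
    with medoid re-selection inside each cluster."""
    rng = random.Random(seed)
    medoids = rng.sample(points, k)            # distinct initial medoids
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                        # assignment step
            i = min(range(k), key=lambda i: dist(p, medoids[i]))
            clusters[i].append(p)
        # update step: each medoid minimizes total distance within its cluster
        medoids = [min(c, key=lambda m: sum(dist(m, q) for q in c))
                   if c else medoids[i]
                   for i, c in enumerate(clusters)]
    return clusters
```

Because `dist` is a parameter, the same loop can be rerun with different (semi-)metrics and the resulting partitions compared, mirroring the evaluation described in the abstract.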

Title of the Paper: FF-Based Feature Selection for Improved Classification of Medical Data


Authors: Yan Wang, Lizhuang Ma

Abstract: In processing medical data, choosing the optimal subset of features is important, not only to reduce the processing cost but also to improve the classification performance of the model built from the selected data. The rough set method has been recognized as one of the powerful tools for medical feature selection. However, its high storage requirements and time-consuming computation restrict its application. In this paper, we propose two new concepts, the discernibility string and the feature forest, and an efficient algorithm, the Feature Forest Based (FF-Based) algorithm, for generating all reducts of a medical dataset. The algorithm consists of two phases: a feature forest construction phase and a disjunctive normal form computation phase. In the first phase, the discernibility strings, each a concatenation of the features that differ between two cases, are used to construct the feature forest. In the second phase, the disjunctive normal form is computed to reduce the features based on the feature forest. Experimental results on medical datasets from the UCI machine learning repository and a real liver cirrhosis dataset show that the algorithms of this paper can efficiently reduce storage cost and improve classification performance.

Keywords: Feature selection, rough set, disjunctive normal form, feature forest, discernibility string
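The discernibility strings described above (features that differ between two cases with different decisions) come from standard rough set theory and can be sketched directly. This illustrates the underlying concept, not the FF-Based algorithm itself; the sample cases are made up.

```python
def discernibility_strings(cases, decisions):
    """For each pair of cases with different decision values, record the
    set of feature indices on which the two cases differ (rough set
    'discernibility' information, the raw material of reduct computation)."""
    out = []
    for i in range(len(cases)):
        for j in range(i + 1, len(cases)):
            if decisions[i] != decisions[j]:       # only discern across classes
                out.append(frozenset(
                    k for k, (a, b) in enumerate(zip(cases[i], cases[j]))
                    if a != b))
    return out
```

A reduct must intersect every such set, which is what the disjunctive normal form computation in the second phase works out.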

Title of the Paper: Analysis of System Bus Transaction Vulnerability based on FMEA Methodology in SystemC TLM Design Platform


Authors: Yung-Yuan Chen, Chung-Hsien Hsu, Kuen-Long Leu

Abstract: Intelligent safety-critical systems, such as intelligent automotive systems or intelligent robots, require stringent reliability while in operation. As system-on-chip (SoC) becomes prevalent in intelligent system applications, the reliability of SoC is receiving more attention in the design industry as SoC fabrication enters very deep submicron technology. The system bus, such as AMBA AHB, provides an integration platform for IP-based SoC. Apparently, the robustness of the system bus plays an important role in SoC reliability. In this study, we propose a useful bus vulnerability model and present a thorough analysis of system bus vulnerability at the SystemC transaction-level modeling (TLM) design level by injecting faults into the bus signals. This analysis can assist us in predicting the robustness of the system bus, in locating the weaknesses of the bus system, and in understanding the effect of bus faults on system behavior during the SoC design phase. The impact of benchmarks on system bus vulnerability is also addressed. The contribution of this work is to raise dependability verification to the TLM abstraction level, which significantly enhances simulation performance, and to provide comprehensive results for validating system bus dependability in the early design phase of safety-critical applications.

Keywords: Fault injection, reliability, SystemC, system bus dependability, system-on-chip (SoC)

Issue 3, Volume 8, March 2009

Title of the Paper: Implementing Lightweight Reservation Protocol for Mobile Network Using Hybrid Schema


Authors: Abdullah Gani, Lina Yang, Nor Badrul Anuar, Omar Zakaria, Ros Surya Taher

Abstract: This paper presents our method for improving a lightweight reservation protocol, inspired by the ever-increasing volume of multimedia traffic over the Internet, which demands quality of service beyond traditional best-effort delivery. The Integrated Services model relies on the Resource Reservation Protocol (RSVP) for signaling and reserving resources. RSVP uses a receiver-initiated reservation mechanism to set up reservations, which adds protocol complexity and incurs additional processing and storage overheads on the routers. Due to this heavyweight characteristic, many researchers have shifted their focus to lightweight reservation protocols. In this paper, we propose a lightweight signaling protocol, the Sender-initiated and Mobility-support Reservation Protocol (SMRP), with Crossover Router (COR) as an extension. The main disadvantage of the COR scheme is that it cannot provide smooth handover, which affects SMRP on mobile hosts. The Pointer Forwarding scheme makes an advance resource reservation only along a one-step forwarding path from the sender along the forwarding chains. To make SMRP more suitable for mobile hosts, we propose a hybrid method combining the advantages of the COR scheme with Pointer Forwarding scenarios. We test it with the Java version of the ns2 network simulator and evaluate the performance of SMRP in a mobile network environment. The results show that the hybrid scheme supports seamless and efficient SMRP path rerouting during handoff while decreasing the drop probability.

Keywords: QoS, Wireless Network, RSVP, SMRP, Hybrid Schemes, Resource Reservation

Title of the Paper: A Method for Land Consolidation Progress Assessment Based on GPS and PDA


Authors: Guangming Zhu, Yingyi Chen, Daoliang Li

Abstract: Field survey is the traditional method used in land consolidation, but it suffers from shortcomings such as low efficiency and long duration when measuring lengths and areas and calculating the quantities of small objects, especially in project areas with complex terrain. This paper proposes using GPS/PDA to assess progress in land consolidation projects, which advances the application of GPS/PDA in this field. The paper introduces the principle of GPS and the interface between GPS and PDA, and realizes real-time interaction between the project area in the field and electronic maps by using GPS's location function, on the basis of which it accomplishes real-time assessment of the progress, quantities and other information for the project's land features.

Keywords: GPS, PDA, land consolidation, progress

Title of the Paper: A Spatial Decision Support System for Land-use Structure Optimization


Authors: Xiaoli Li, Yingyi Chen, Daoliang Li

Abstract: This article describes a decision support system for land-use structure optimization and land-use allocation. The system was established to let rural land managers explore their land use options. It integrates database technology, expert system technology and spatial decision support system technology. The DSS consists of four components: a geographic information system (GIS), land use modules, a graphical user interface and land use planning tools. Linear programming, fuzzy clustering, and other land-use structure optimization algorithms are implemented on the ArcEngine software platform. The system has been applied in the Beijing Pinggu area. The results suggest that this system can be a useful tool to support management decisions.

Keywords: Land-use Structure Optimization, Land use allocation, ComGIS, DSS

Title of the Paper: Application of Image Texture Analysis to Improve Land Cover Classification


Authors: Xiaochen Zou, Daoliang Li

Abstract: Image texture analysis has received a considerable amount of attention over the last few years, as it plays an important role in the classification of remote sensing images. This paper provides an overview of several different approaches to image texture analysis and demonstrates their use on the problem of land cover classification. We used the grey level co-occurrence matrix (GLCM) method to assist land cover classification, and then compared and evaluated the classification results. In the experiments, by comparing the classification results for contrast, energy and entropy, we found that the preferable texture feature of the GLCM method was contrast. The feature images were used to aid the classification of remote sensing data and yielded good results. A C++ program was also written to compute the texture features.

Keywords: Land cover classification, Texture analysis, Grey level co-occurrence matrices method, Texture feature
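The three GLCM features compared above (contrast, energy, entropy) have standard definitions over the normalized co-occurrence matrix. A self-contained sketch in Python rather than the authors' C++, with a made-up toy image:

```python
import numpy as np

def glcm_features(img, levels=4, dx=1, dy=0):
    """Build the grey level co-occurrence matrix for pixel offset (dx, dy),
    normalize it to probabilities p(i, j), and return the contrast, energy
    and entropy features compared in the paper."""
    glcm = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[img[y, x], img[y + dy, x + dx]] += 1   # count grey-level pairs
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    contrast = float(((i - j) ** 2 * p).sum())          # weighted by level gap
    energy = float((p ** 2).sum())                      # angular second moment
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())          # in bits
    return contrast, energy, entropy
```

A perfectly uniform image has all co-occurrence mass on the diagonal, giving zero contrast, maximal energy and zero entropy, which makes the toy case easy to check by hand.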

Title of the Paper: A Knowledge Management Practice Investigation in Romanian Software Development Organizations


Authors: Iuliana Scorta

Abstract: Knowledge plays an increasingly large role in organizations, and many consider it the most important factor of production in a knowledge economy. Knowledge is dynamic and evolves with technology, organizational culture and the changing needs of an organization's software development practices. In this paper I frame my research by discussing the importance of knowledge management in software engineering. After presenting the landscape that inspired the study's research questions, I conduct an investigation evaluating aspects of knowledge management practice in the Romanian software engineering industry and discuss the major findings.

Keywords: Tacit knowledge, Explicit knowledge, Knowledge management, Romanian software engineering

Title of the Paper: Quality Control and ISO Quality Compliance in the Product Lifecycle Management at Siemens


Authors: Clotilde Rohleder

Abstract: From our experience with customers who deploy customized software products, we have learned that deriving products from shared software assets requires more than complying with quality standards like ISO 9126. Additionally, developers must consider what we call the quality profile of the final product. A process that matches the quality profile of the final product during product derivation helps provide and validate industrial software application solutions. This paper describes this matching concept and its application in a case study of the development of a product lifecycle reporting tool at a large organization. We also propose a tool to improve compliance with ISO quality standards that could enhance quality control in the derivation process.

Keywords: Quality Product Derivation, Requirements Engineering, Non-Functional Requirements, ISO Quality Standards, Quality Compliance

Title of the Paper: A New Iterative Approach for Dominant Points Extraction in Planar Curves


Authors: Cecilia Di Ruberto, Andrea Morgera

Abstract: In this paper the problem of dominant point detection on digital curves is addressed. Starting from an initial set of curvature points, our approach iteratively adds significant points by looking for the contour points of highest curvature. The process continues until, for every pair of consecutive dominant points, the sum of the distances of the contour points in the subtended arc to the chord between them is less than a predefined threshold. A final refinement process adjusts the positions of the located dominant points by a minimum integral square error criterion. We successfully test our method by comparing its performance with other well-known dominant point extraction techniques. In the last section some examples of polygonal approximation are shown.

Keywords: Curvature, Digital Curve, Dominant Points, Polygonal Approximation
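The stopping criterion described in the abstract, the sum of distances from arc points to the chord between consecutive dominant points, can be sketched with the standard point-to-line distance formula. An illustrative sketch only, not the authors' full iterative algorithm:

```python
import math

def chord_deviation(arc):
    """Sum of perpendicular distances from the interior points of `arc`
    to the chord joining the arc's endpoints. When this sum falls below
    a threshold, the arc is considered well approximated by the chord."""
    (x1, y1), (x2, y2) = arc[0], arc[-1]
    length = math.hypot(x2 - x1, y2 - y1)
    if length == 0:
        return 0.0
    # |cross product| / chord length gives the perpendicular distance
    return sum(abs((x2 - x1) * (y1 - y) - (x1 - x) * (y2 - y1)) / length
               for x, y in arc[1:-1])
```

For the three-point arc (0,0), (1,1), (2,0), the single interior point lies exactly 1 unit above the chord, so the deviation is 1.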

Title of the Paper: 'No Representation without Information Flow' - Measuring Efficacy and Efficiency of Representation: An Information Theoretic Approach


Authors: Junkang Feng, Yang Wang

Abstract: Representation is a key concept for semiotics and for information systems. Stamper's framework may be seen as outlining, among others, what is required for the efficacy of signs in standing for (i.e., representing) something else in an organization. We explore how the efficacy and efficiency of representation may be measured, which seems overlooked in the available literature on information systems and organizational semiotics. We approach this problem from the perspective of what we call the 'information carrying' relation between the representation and the represented. We model the represented as an information source, the representation as an information carrier, and the 'representing' relationship between them as one of 'information carrying'. That is, information is carried and therefore flows. These are then further modeled mathematically as random variables and random events, and a special relationship between random events. This approach enables us to reveal a necessary condition for the efficacy and efficiency of representation, and to measure it. To this end we extend Dretske's semantic theory of information. The conviction that we put forward here is 'No representation without information flow', based upon which the efficacy and efficiency of a representation system may be measurable.

Keywords: Representation, Information systems, Information theory, Information content, Database design
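When the represented and the representation are modeled as random variables, "information flow" between them is commonly quantified by mutual information. The sketch below estimates it from joint samples; this is a generic information-theoretic illustration, not the paper's extended Dretskean measure.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(X; Y) in bits from a list of joint (x, y) samples:
    sum over observed pairs of p(x,y) * log2(p(x,y) / (p(x) p(y))).
    Zero means no information flows from source to carrier."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

A carrier that perfectly tracks a binary source yields 1 bit; a carrier independent of the source yields 0, illustrating the 'no representation without information flow' condition.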

Title of the Paper: Nighttime Vehicle Light Detection on a Moving Vehicle using Image Segmentation and Analysis Techniques


Authors: Yen-Lin Chen

Abstract: This study proposes a vehicle detection system that identifies vehicles by locating their headlights and rear-lights in the nighttime road environment. The proposed system comprises two stages for detecting the vehicles in front of the camera-assisted car. The first stage is a fast automatic multilevel thresholding, which separates bright objects from the grabbed nighttime road scene images. This automatic multilevel thresholding approach provides the robustness and adaptability for the system to operate under various nighttime illumination conditions. The extracted bright objects are then processed by the second stage, a knowledge-based connected-component analysis procedure, to identify vehicles by locating their lights and to estimate the distance between the camera-assisted car and the detected vehicles. Experimental results demonstrate the feasibility and effectiveness of the proposed approach for vehicle detection at night.

Keywords: Computer vision, vehicle detection, image segmentation, image analysis, multilevel thresholding, autonomous vehicles
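Automatic thresholding of the kind the first stage performs is commonly built on Otsu's criterion: choose the threshold that maximizes between-class variance of the histogram. The sketch below shows one level of such a scheme; it is a generic illustration, not the authors' specific multilevel algorithm.

```python
def otsu_threshold(hist):
    """Otsu's method on a grey-level histogram: return the threshold t
    (pixels <= t form the dark class) that maximizes the between-class
    variance w0 * w1 * (m0 - m1)^2."""
    total = sum(hist)
    total_mean = sum(i * h for i, h in enumerate(hist)) / total
    w0 = cum = 0.0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist[:-1]):
        w0 += h / total                 # dark-class weight
        cum += t * h / total            # dark-class first moment
        w1 = 1.0 - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = cum / w0, (total_mean - cum) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

A multilevel scheme applies the same criterion recursively (or jointly over several thresholds) to separate bright vehicle lights from mid-grey road and dark background.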

Title of the Paper: Triplet-Based Topology for On-Chip Networks


Authors: Wang Zuo, Zuo Qi, Li Jiaxin

Abstract: Most CMPs use an on-chip network to connect cores and tend to integrate more simple cores on a single die. As the number of cores increases, the on-chip network will play an important role in the performance of future CMPs. Given the tradeoff between performance and area constraints in on-chip network design, we propose the use of a triplet-based topology in on-chip interconnection networks and demonstrate how a 9-node triplet-based topology can be mapped to an on-chip network. By using a group-caching protocol to exploit traffic locality, the triplet-based topology achieves lower latency and energy consumption than a 2D mesh. We run multithreaded commercial benchmarks on the multi-core simulator GEMS to generate realistic traffic and simulate this traffic on the network simulator Garnet. Our experimental results show that the triplet-based network can increase work-related throughput by 3%~11% and reduce average network latency by 24%~32% compared with a 2D mesh, with router energy consumption reduced by 13%~16% and link energy consumption reduced by 14%~16%.

Keywords: On-chip network, Cache protocol, Network latency, Energy consumption, Performance, Mapping

Title of the Paper: Neural Architectures Optimization and Genetic Algorithms


Authors: Mohamed Ettaouil, Youssef Ghanou

Abstract: Artificial neural networks (ANN) have proven their efficiency in several applications: pattern recognition, voice processing and classification problems. The training stage is very important to an ANN's performance. The selection of an architecture suited to solving a given problem is one of the most important aspects of neural network research. The choice of the number of hidden layers and the values of the weights has a large impact on the convergence of the training algorithm. In this paper we propose a mathematical formulation for determining the optimal number of hidden layers and good values of the weights, and we use genetic algorithms to solve this problem. Computational experiments are presented; the numerical results confirm the effectiveness of the theoretical results and the advantages of the new modelling.

Keywords: Artificial neural networks (ANN), Non-linear optimization, Genetic algorithms, Supervised Training, Feed forward neural network
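The genetic search over architectures described above can be illustrated with a minimal GA: binary genomes, tournament selection, one-point crossover and bit-flip mutation. The genome encoding and the fitness function below are purely hypothetical stand-ins (a genome read as the binary count of hidden units, scored against an assumed ideal size), not the paper's formulation.

```python
import random

def genetic_search(fitness, genome_len, pop=20, gens=40, seed=1):
    """Minimal genetic algorithm: binary genomes, size-2 tournament
    selection, one-point crossover, 10% bit-flip mutation. `fitness`
    is maximized; the best genome of the final population is returned."""
    rng = random.Random(seed)
    P = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop)]
    for _ in range(gens):
        def pick():                          # tournament of two
            a, b = rng.sample(P, 2)
            return a if fitness(a) >= fitness(b) else b
        Q = []
        while len(Q) < pop:
            a, b = pick(), pick()
            cut = rng.randrange(1, genome_len)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.1:
                i = rng.randrange(genome_len)
                child[i] ^= 1                # bit-flip mutation
            Q.append(child)
        P = Q
    return max(P, key=fitness)

# Hypothetical use: the genome encodes a hidden-unit count in binary and
# fitness penalizes deviation from an assumed ideal size of 12 units.
best = genetic_search(lambda g: -abs(int("".join(map(str, g)), 2) - 12), 5)
```

In the paper's setting, the fitness would instead score a candidate architecture and weight values by training error.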

Title of the Paper: Performance Analysis of Mobile IP Registration Protocols


Authors: Rathi S, Thanushkodi K

Abstract: Mobile IPv6 will be an integral part of the next generation Internet protocol. The importance of mobility in the Internet keeps increasing. The current specification of Mobile IPv6 does not provide proper support for security in the mobile network, and there are other problems associated with it. This paper is concerned with security aspects of the registration protocols in Mobile IP. Providing security in Mobile IP registration is highly important: the registration part must be guarded against any malicious attacks that might try to take illegitimate advantage of any participating principal. This paper presents a performance analysis of the various protocols available for Mobile IP registration. The parameters considered for comparison are data confidentiality, authentication, attack prevention, registration delay and computational complexity. The paper aims to determine which protocol outperforms the others when these parameters are taken into consideration.

Keywords: Mobile IP, Mobility Agents, Confidentiality, Authentication, Attack prevention and Registration delay

Title of the Paper: Architecture for Address Auto-Configuration in MANET based on Extended Prime Number Address Allocation (EPNA)


Authors: Harish Kumar, R. K. Singla

Abstract: An efficient technique for IP address auto-configuration of nodes is an important component of MANET setup. Previously proposed approaches either need a broadcast over the complete network and/or need a duplicate address detection mechanism, leading to significant latency and overhead in the assignment of addresses. In this paper, an Extended Prime Number Address Allocation (EPNA) technique is proposed that is conflict-free and distributed. EPNA does not require broadcast, and address assignment can be completed within a maximum of four hops of communication. In this technique a large number of nodes are empowered to act as proxies to assign addresses, and most of the time assignment is a one-hop communication. Analytical results show that EPNA is capable of auto-configuring any node without much latency and overhead, irrespective of address space. The proposed algorithm also gives an even IP address distribution and low complexity for small as well as large MANETs.

Keywords: Auto-configuration architecture, Duplicate address detection, MANET, IP address, EPNA
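The conflict-free property of prime-number allocation rests on unique factorization: if each allocator hands out its own address multiplied by a fresh prime, no two allocators can ever derive the same value. The sketch below illustrates that core idea only; it is not the EPNA scheme itself, whose details (extension rules, proxy selection) are in the paper.

```python
from itertools import count

def primes():
    """Endless prime generator by trial division (fine for a sketch)."""
    found = []
    for n in count(2):
        if all(n % p for p in found):
            found.append(n)
            yield n

def allocate(parent_addr, prime_iter, k):
    """Hand out k child addresses as parent_addr * fresh prime. By unique
    factorization, addresses minted from different parents with disjoint
    prime sets can never collide, so no duplicate address detection is
    needed. (Illustrative sketch of prime-number allocation, not EPNA.)"""
    return [parent_addr * next(prime_iter) for _ in range(k)]
```

For example, a root at address 1 mints 2, 3, 5; the node at 2 then mints 14 and 22 from the next fresh primes, disjoint from every other node's allocations.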

Title of the Paper: Association Rules Mining Including Weak-Support Modes Using Novel Measures


Authors: Jian Hu, Xiang-Yang Li

Abstract: In association rules mining applications, some rules can provide much useful knowledge even though they have low support; we call these weak-support modes in this paper. However, in the existing Support-Confidence framework, rules with low support are lost. Thus, this paper puts forward a new association rules mining technique, which sets a lower support threshold to ensure that weak-support rules are mined and applies the Csupport measure to recognise weak-support modes. Then a new measure, called N-confidence, is used to restrict the mining size when generating frequent sets, which can strain away weak-support rules without correlation. Furthermore, this paper puts forward a new interestingness measure to rank association rules by degree of interest. To enhance mining efficiency, a novel algorithm, FT-Miner, is presented to discover association rules in a forest by using two new data structures, the UFP-Tree and the FP-Forest. The experiments show that the algorithm not only mines useful weak-support rules, but also performs better than classical association rules mining algorithms.

Keywords: Data mining, Association rules, Weak-support mode, UFP-Tree, FP-Forest
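The role of the support threshold discussed above is easy to see in a minimal frequent-itemset counter: lowering min_support is exactly what retains the weak-support itemsets that the standard framework discards. A generic sketch (itemsets up to size 2 only), not the FT-Miner algorithm or its UFP-Tree/FP-Forest structures:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Count the support (fraction of transactions) of every itemset of
    size 1 or 2 and keep those meeting min_support. A lower min_support
    retains 'weak-support' itemsets that a higher threshold would lose."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        for size in (1, 2):
            for s in combinations(sorted(t), size):
                counts[s] = counts.get(s, 0) + 1
    return {s: c / n for s, c in counts.items() if c / n >= min_support}
```

Measures like the paper's N-confidence would then prune the retained low-support itemsets that show no real correlation.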

Title of the Paper: Automatic Discovery of Data Resources in the E-Government Grid


Authors: Xianhu Meng, Yan Wang, Wenyu Zhang, Jinqi Meng

Abstract: Nowadays, a growing number of e-government portals and solutions provide integrated governmental data resources to customers (citizens, enterprises or other public sectors). However, the administration of distributed data resources is faced with increasing challenges caused by discovery difficulties across the Internet. To overcome this, this paper puts forward a model for automatic data resource discovery in a data grid environment for e-government applications. The paper elaborates the rule of global naming, brings forward metadata registration and storage based on the naming rule, and explores how to use it to rapidly locate remote data resources. The paper also sets forth a method for restraining the repeated sending and accessing of data resources by searching a “comparison table” and using time markers across multiple data grid nodes, and points out that the “comparison table” should adopt the strategy of preferring frequently used data and clearing rarely used data.

Keywords: Data grid, E-government, Metadata, Global naming, Data discovery

Issue 4, Volume 8, April 2009

Title of the Paper: Analysis of Alzheimer's Disease Progression in Structural Magnetic Resonance Images


Authors: B. S. Mahanand, M. Aswatha Kumar

Abstract: Structural magnetic resonance imaging (MRI) of the brain is an increasingly useful tool in the study of neurodegenerative diseases. MRI is currently the fastest-developing medical imaging modality and is applied to an increasing number of medical diagnostic situations. The serial acquisition of structural images of a subject's brain over time offers opportunities to monitor the progression of tissue volume changes in fine detail at all anatomical locations. As a result, the analysis of structural MRI data has been an active area of image analysis research for many years, especially for early diagnosis and the tracking of disease progression, which makes it possible to investigate, for instance, how a patient responds to treatment. The aim of this paper is to investigate and analyze brain tissue changes in Alzheimer's disease using non-rigid medical image registration and statistical analysis techniques. In our proposed approach, first the source and target images are affinely registered to correct global differences between them, and then non-rigid registration based on free-form deformation using B-spline approximation is performed. The resulting displacement values from the non-rigid registration are further investigated for deformation over the entire brain to detect typical deformation patterns. Finally, a statistical method, namely the t-test, is applied for analysis of the results. The initial results indicate that tissue volume change in the brain occurs predominantly in the hippocampus.

Keywords: Magnetic resonance imaging, Image registration, Free-form deformation, Alzheimer’s disease, Statistical analysis, T-test
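As a rough illustration of the statistical step named in the abstract (not the authors' implementation), a two-sample t statistic over displacement magnitudes from two groups can be computed as below; the sample values are invented for illustration only:

```python
import math

def t_statistic(a, b):
    """Welch's two-sample t statistic for independent samples."""
    na, nb = len(a), len(b)
    ma = sum(a) / na
    mb = sum(b) / nb
    # unbiased sample variances
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# hypothetical hippocampal displacement magnitudes (mm) for
# patients vs. controls -- illustrative numbers, not study data
patients = [1.9, 2.3, 2.1, 2.6, 2.4]
controls = [0.8, 1.1, 0.9, 1.2, 1.0]
print(round(t_statistic(patients, controls), 2))  # → 9.0
```

A large positive t here would indicate that the patient group's displacements differ systematically from the controls'.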

Title of the Paper: Public Institutions’ Investments with Data Mining Techniques


Authors: Adela Bara, Ion Lungu, Simona Vasilica Oprea

Abstract: Developing decision support systems in public institutions such as national power grid companies requires applying very efficient methods to support the decisions. The decision support system in a national power grid company can integrate energy predictions produced by data mining algorithms that help managers substantiate their investment decisions and justify financial feasibility. This is done by estimating all the benefits and costs over the life cycle. In order to estimate the revenues, we need to know the output of these power plants with a certain accuracy. Because the wind speed fluctuates significantly at a given location, even during a single day, the wind power output is difficult to forecast by statistical methods. In this paper we apply data mining techniques to the available measured weather data in order to predict the wind power output and determine the financial feasibility of the investment.

Keywords: Data Mining (DM), Decision support systems (DSS), Wind Power Plant (WPP), wind power forecast, measured weather parameters

Title of the Paper: Improving Performance in Integrated DSS with Object Oriented Modeling


Authors: Adela Bara, Vlad Diaconita, Ion Lungu, Manole Velicanu

Abstract: The development cycle of a decision support system involves many resources, considerable time and high cost, and above all, the database schema used in the system is built only for specific tasks. Consequently, a relational database schema or a data warehouse prototype cannot be easily mapped and reused in multiple DSS projects. In this paper we propose an object-oriented (OO) approach and an OO database schema for DSS implementation that requires multidimensional modeling at the conceptual level, using fundamental OO concepts (class, attribute, method, object, polymorphism, inheritance, hierarchy) to represent all multidimensional properties of a data warehouse at both the structural and dynamic levels.

Keywords: Decision support systems (DSS), SQL optimization, integration, object-oriented modeling, object-oriented databases

Title of the Paper: Grammatical Inference Methodology for Control Systems


Authors: Aboubekeur Hamdi-Cherif, Chafia Kara-Mohammed

Abstract: Machine Learning is a computational methodology that provides automatic means of improving programmed tasks from experience. As a subfield of Machine Learning, Grammatical Inference (GI) attempts to learn structural models, such as grammars, from diverse data patterns, such as speech, artificial and natural languages, and sequences provided by bioinformatics databases, among others. Here we are interested in identifying artificial languages from sets of positive and possibly negative sample sentences. The present research intends to evaluate the effectiveness and usefulness of grammatical inference in control systems. The ultimate far-reaching goal addresses the issue of robots for self-assembly purposes. At least two benefits are to be drawn. First, on the epistemological level, it unifies two apparently distinct scientific communities, namely the formal language theory and robot control communities. Second, on the technological level, blending research from both fields results in the appearance of a richer community, as has been proven by the emergence of many multidisciplinary fields. Can we integrate diversified works dealing with robotic self-assembly while concentrating on grammars as an alternative control methodology? Our aim is to answer this central question positively. As far as this paper is concerned, we set out the broad methodological lines of the research while stressing the integration of these different approaches into one single unifying entity.

Keywords: Machine learning, robot control, grammatical inference, graph grammar, formal languages for control, self-assembly, intelligent control, emergent control technologies

Title of the Paper: Web Service-Driven Framework for Maintaining Global Version Consistency in Distributed Enterprise Portal


Authors: Hui-Ling Lin, Shao-Shin Hung, Derchian Tsaih, Chiehyao Chang

Abstract: The explosion of the web has led to a situation where a majority of the traffic on the Internet is web related. Today, practically all of the popular web sites are served from single locations. This necessitates frequent long-distance network transfers of data (potentially repeatedly), which results in high response times for users and wastes the available network bandwidth. This paper presents a new approach to web replication, where each of the replicas resides in a different part of the network and the browser is automatically and transparently directed to the “best” server. This paper presents a transnational hierarchical global patch consistent model called THGPCM (Transnational Hierarchical Global Patch Consistent Model). The model improves OPDS (Original Patching Data Source)-enabled network equipment, making it capable of updating patch parameters in the enterprise according to specified OPDS partial dependency relationships. The application scenario can reduce the global patch service cost of transnational enterprise network equipment and minimize the turnaround time of patch service delay.

Keywords: Service-oriented, patch, consistency, routing, distributed

Title of the Paper: An Effective Sampling Method for Decision Trees Considering Comprehensibility and Accuracy


Authors: Hyontai Sug

Abstract: Because the target domain of data mining using decision trees usually contains a lot of data, sampling is needed. But selecting proper samples for a given decision tree algorithm is not easy, because each decision tree algorithm has its own characteristics in generating trees, and selecting samples that represent the target data set well is difficult. As the size of the samples grows, the size of the generated decision trees grows, with some improvement in error rates. But we cannot use ever larger samples, because large decision trees are hard to understand and overfitting can occur. This paper suggests a progressive approach to determining a proper sample size for generating good decision trees with respect to generated tree size and accuracy. Experiments with two representative decision tree algorithms, CART and C4.5, show very promising results.

Keywords: Decision trees, proper sample size determination
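The progressive idea in the abstract can be illustrated by a toy sketch (not the paper's CART/C4.5 setup): a one-level decision stump is trained on samples of doubling size until test accuracy stops improving meaningfully. The data generator, learner and stopping threshold are all invented for illustration:

```python
import random

random.seed(1)

# synthetic target population: label is x > 0.5, with 10% label noise
def draw(n):
    data = []
    for _ in range(n):
        x = random.random()
        y = (x > 0.5) != (random.random() < 0.1)
        data.append((x, y))
    return data

def train_stump(sample):
    """Pick the threshold among sample points minimizing training error."""
    best_t, best_err = 0.5, float("inf")
    for t, _ in sample:
        err = sum((x > t) != y for x, y in sample)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def accuracy(t, data):
    return sum((x > t) == y for x, y in data) / len(data)

test_set = draw(2000)
n, prev_acc = 25, 0.0
while True:
    acc = accuracy(train_stump(draw(n)), test_set)
    print(f"n={n:5d}  test accuracy={acc:.3f}")
    if acc - prev_acc < 0.005 or n >= 3200:  # stop when gains flatten
        break
    prev_acc, n = acc, n * 2
```

Once the accuracy curve flattens, larger samples only grow the tree without paying for themselves, which is the trade-off the paper examines.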

Title of the Paper: Semantic Networks-Based Teachable Agents in an Educational Game


Authors: Harri Ketamo

Abstract: The aim of the study was to apply teachable agents in an educational game intended for children under 12 years of age and to evaluate the outcome in the context of the cognitive psychology of learning. The study was done in two phases: a design phase and an evaluation phase (N=300). The game was designed to support relatively free use of teachable agents in an easy-to-use environment. The main findings of the study were a clear causality between the quality of the taught semantic networks in the game world and the players' knowledge in real life, and evidence that learning away is an important feature when trying to enable conceptual change in educational games.

Keywords: Game AI, Reinforcement learning, Conceptual learning, Semantic networks, Serious games

Title of the Paper: Higher Education ERPs: Implementation Factors and their Interdependencies


Authors: Ana-Ramona Bologa, Mihaela Muntean, Gheorghe Sabau, Iuliana Scorta

Abstract: In this paper we analyze some critical implementation factors of ERP projects in universities and their interdependencies. Taking into consideration that many studies have already been performed for industry implementations, we started by considering university implementations as a particular case of industry ones. From there, we identified and analyzed differences in the case of universities regarding communication structure, management involvement, organization, implementation team competences, legacy systems, inter-department communication, user training, supplier/customer partnerships and external consultants. The conclusions of this study will be used to develop an evaluation framework of ERP solutions for higher education management.

Keywords: ERP systems, critical success factors, higher education, university management

Title of the Paper: Interactive Natural Language Interface


Authors: Faraj A. El-Mouadib, Zakaria Suliman Zubi, Ahmed A. Almagrous, I. El-Feghi

Abstract: To overcome the complexity of SQL and to facilitate the manipulation of data in databases for ordinary users (not SQL professionals), many researchers have turned to using natural language instead of SQL. This idea has prompted the development of a new type of processing method called Natural Language Interface to Database systems (NLIDB). The NLIDB system is actually a branch of a more comprehensive field called Natural Language Processing (NLP). In general, the main objective of NLP research is to create an easy and friendly environment for interacting with computers, in the sense that computer usage requires no programming language skills to access the data; only natural language (e.g., English) is required. Many systems have been developed to apply NLP concepts in a variety of domains, for example the LUNAR system [19] and the LADDER system [8]. One drawback of previous systems is that the grammar must be tailor-made for each given database. Another drawback is that many NLP systems cover only a small domain of English-language questions. In this paper we present the design and implementation of a natural language interface to a database system. The system is called Generic Interactive Natural Language Interface to Databases (GINLIDB). It is designed using UML and developed using Visual Basic.NET 2005. Our system is generic in nature, given an appropriate database and knowledge base. This feature makes our system distinctive.

Keywords: SQL, NLP, database, UML, NLIDB, DBMS, ATN
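A generic NLIDB is far beyond an abstract, but the core idea of mapping a restricted natural-language pattern to SQL can be sketched as follows; the table, question pattern and vocabulary are invented for illustration and are unrelated to GINLIDB's actual grammar:

```python
import re
import sqlite3

def to_sql(question):
    """Translate questions of the form
    'show <column> of <table> where <column> is <value>' into SQL."""
    m = re.match(r"show (\w+) of (\w+) where (\w+) is (\w+)", question.lower())
    if not m:
        raise ValueError("unsupported question form")
    col, table, wcol, val = m.groups()
    return f"SELECT {col} FROM {table} WHERE {wcol} = ?", (val,)

# toy database standing in for the application's data
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE students (name TEXT, city TEXT)")
db.executemany("INSERT INTO students VALUES (?, ?)",
               [("alice", "tripoli"), ("bob", "benghazi")])

sql, params = to_sql("show name of students where city is tripoli")
print(db.execute(sql, params).fetchall())  # [('alice',)]
```

A real NLIDB replaces the single regular expression with a grammar and a lexicon bound to the database schema, which is exactly what makes the per-database tailoring problem the abstract mentions.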

Title of the Paper: Latent Knowledge Structures of Traversal Behavior in Hypertext Environment


Authors: Perwaiz B. Ismaili

Abstract: In this paper, we introduce “Knowledge Diagraph Contribution” (KDC) analysis as a novel categorical time-series method for observing the underlying traversal knowledge structures of experts by exploiting varying hypertext (web) presentation formats and knowledge domains. Navigation behaviors were studied by designing hypertext presentation formats and domain text that adhere to content design principles inspired by discourse and text comprehension scholars. As a continuation of a previous study by Ismaili & Golden [1], twenty undergraduate psychology students from the University of Texas at Dallas participated in this study. Students traversed different hypertext (web) presentation formats while reading content from three different knowledge domains controlled for micro (web-page, web-site) and macro (consistent semantic connections across knowledge domains) characteristics. The influence of expertise and web traversal behavior in deriving underlying knowledge structures is presented using KDC analysis. In addition, previously reported classical data analyses (ANOVA) are compared with KDC analysis to highlight quantitative and qualitative differences in these derived latent knowledge structures. Compared with novices, experts tend to exhibit sequential and semantic traversal patterns across all three web formats, whereas novices tend to employ random navigation strategies.

Keywords: Hypertext, Knowledge Structures, Navigation, Traversal Patterns, Content Design, Expertise

Title of the Paper: Performance Assessment of Search Management Agent under Asymmetrical Problems and Control Design Applications


Authors: Jukkrit Kluabwang, Deacha Puangdownreong, Sarawut Sujitjorn

Abstract: The article presents a performance evaluation of the management agent (MA) containing the adaptive tabu search (ATS) as its search core. In particular, asymmetrical surface optimization problems are considered. It has been found that the symmetry of the problems has a significant effect on the search performance of the ATS, but not on that of the MA(ATS). On average, the MA(ATS) is about 2 times faster than the ATS on both symmetrical and asymmetrical problems. The article also reviews the ATS and MA(ATS) algorithms. An application to controller design for a coupled three-degree-of-freedom system is also elaborated.

Keywords: Asymmetrical surface optimization problems, adaptive tabu search, management agent

Title of the Paper: Redundancy and its Applications in Wireless Sensor Networks: A Survey


Authors: Daniel-Ioan Curiac, Constantin Volosencu, Dan Pescaru, Lucian Jurca, Alexa Doboli

Abstract: In this paper we present and classify various approaches to redundancy in the area of wireless sensor networks, related to sensing, communication and information processing. Sometimes an ally, sometimes a foe, redundancy is an inherent feature of sensor networks that has to be examined very carefully in order to improve important aspects of their functioning. Moreover, this paper presents two methodologies: one that involves both components of spatial redundancy and one that uses temporal redundancy to achieve fault-tolerant and safe operation. Finally, the fields in which redundancy can be applied with significant results are highlighted.

Keywords: Wireless sensor networks, redundancy, sensing coverage, communication

Title of the Paper: Testing Inertial Sensor Performance as Hands-Free Human-Computer Interface


Authors: Josip Music, Mojmil Cecic, Mirjana Bonkovic

Abstract: The paper introduces a hands-free human-computer interface designed around a commercially available inertial sensor pack. It is primarily intended to provide computer access for people with little or no upper-limb functionality, but it can also be used by able-bodied subjects in certain application scenarios. The performance of the proposed device was evaluated on twelve healthy subjects performing a multi-directional point-and-select task, with throughput as the main performance parameter. The system was tested using two different pointer control schemes as well as three selection techniques. Test subjects were given two questionnaires (one per control scheme) to provide a comfort assessment of the device, and short post-measurement interviews provided user feedback. The obtained performance and comfort assessment results are presented and discussed.

Keywords: Inertial sensors, head-joystick, throughput, performance evaluation, human-computer interface

Title of the Paper: A Mobile Location-based Information Recommendation System Based on GPS and WEB2.0 Services


Authors: Fan Yang, Zhi-Mei Wang

Abstract: Combining GPS location-based services and the latest Web 2.0 technologies, this paper builds a scalable personalized mobile information pushing platform which can provide user-friendly and flexible location-based services. We first propose a location-based data and service middleware based on a service-oriented architecture in order to implement a mobile information pushing system involving the integration and conversion of a variety of data formats, as well as a combination of a wide range of services. Then, we propose a novel 3-D Tag-Cloud module that can visualize useful retrieval information even on a limited mobile screen. In particular, we design a multi-dimensional collaborative filtering algorithm to achieve dynamic personalized recommendation and mobile information sharing. In cooperation with several restaurants, we have also developed a dynamic mobile location-based restaurant recommendation and discount coupon pushing system. The successful application of the system demonstrates the efficiency of our ideas.

Keywords: Mobile Information Pushing, GPS, Web2.0, Location Based Service (LBS), Tagging, Collaborative Filtering, Personalized Recommendation
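The collaborative filtering in the abstract is multi-dimensional and location-aware; as a much-reduced sketch of the underlying idea, a plain user-based scheme with cosine similarity over restaurant ratings might look like this (all names and ratings are hypothetical):

```python
import math

ratings = {  # user -> {restaurant: rating}, invented sample data
    "ann": {"noodle_bar": 5, "sushi_go": 3, "pizza_hub": 1},
    "ben": {"noodle_bar": 4, "sushi_go": 3, "pizza_hub": 2},
    "cho": {"noodle_bar": 1, "sushi_go": 2, "pizza_hub": 5},
    "dan": {"noodle_bar": 5, "sushi_go": 4},  # pizza_hub unrated
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(u[i] ** 2 for i in common))
           * math.sqrt(sum(v[i] ** 2 for i in common)))
    return num / den

def predict(user, item):
    """Similarity-weighted average of other users' ratings for item."""
    num = den = 0.0
    for other, r in ratings.items():
        if other != user and item in r:
            s = cosine(ratings[user], r)
            num += s * r[item]
            den += s
    return num / den if den else 0.0

print(round(predict("dan", "pizza_hub"), 2))  # → 2.6
```

The paper's version would additionally weight neighbors by dimensions such as location and tags before aggregating.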

Title of the Paper: Schedule Risk Management for Business M-Applications Development Projects


Authors: Paul Pocatilu, Marius Vetrici

Abstract: The great majority of software development projects are known to be late and over budget. Several surveys performed during the last 15 years expose a relatively poor performance in delivering successful software projects. Most projects hit schedule and budget overruns of 25% to 100%, and sometimes even more. Even though m-application development is a new software development field, this type of project is not immune to the common flaws of software development projects. Therefore, the main goal of this paper is to reduce the gap between the estimated duration of an m-application development project and the actual elapsed time. We find that proven best-practice project management techniques can be successfully employed for schedule risk management. Furthermore, we present three proven software project management techniques that we have successfully adapted to the development of m-applications. The first is the estimation of m-application project duration using top-down and bottom-up approaches. The second is the use of a set of performance metrics for project quality assessment. The last is the Extended Metrix model, a stochastic project duration estimation model with schedule risk analysis elements.

Keywords: Mobile applications, software development, project duration, schedule risk management, Monte Carlo simulation

Issue 5, Volume 8, May 2009

Title of the Paper: Dealings with Problem Hardness in Genetic Algorithms


Authors: Stjepan Picek, Marin Golub

Abstract: Genetic algorithms (GA) have been successfully applied to various problems, both artificial and real-world. When working with GAs it is important to recognize the situations in which they will not find the optimal solution, in other words, to recognize problems that are difficult for a GA to solve. There are various reasons why GAs will not converge to optimal solutions, and a combination of one or more of these reasons can make a problem GA-hard. Today there are numerous measures for characterizing GA-hard problems; every measure has its specific advantages and drawbacks. In this work the effectiveness of one of these measures, the Negative Slope Coefficient (NSC), is evaluated. A different measure is proposed, called the New Negative Slope Coefficient (NNSC), which aims to address certain drawbacks of the original method. Possible guidelines for further development of this and comparable methods are proposed.

Keywords: Genetic Algorithm, Unitation, Fitness Landscape, Negative Slope Coefficient, Hardness, Difficulty, Deception

Title of the Paper: Encrypting Messages with Visual Key


Authors: Dusmanescu Dorel

Abstract: This paper presents a method for encrypting messages which uses an image, or a part of an image, as the encryption/decryption key. The need to encrypt messages to protect their content has grown since the advent of the Internet and electronic transactions. The great number of messages that flow over computer networks and the Internet requires very fast encryption/decryption processes to assure real-time secured communications. The complex algorithms currently used for encrypting/decrypting messages use complex keys and complex encryption/decryption functions that demand increased computing power in real-time applications (e.g., media).

Keywords: Message, image, key, encrypting, decrypting, visual key
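The abstract does not disclose the actual cipher, but the general idea of using image bytes as key material can be sketched as a simple XOR stream; this is illustrative only and far weaker than a real design, and the "image" here is just a stand-in byte string:

```python
def xor_with_image_key(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with the image key, repeating the key
    if the message is longer. Applying it twice restores the input."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# stand-in for pixel bytes taken from a region of an image
image_key = bytes([137, 80, 78, 71, 13, 10, 26, 10, 200, 17, 66, 3])

message = b"meet me at the old bridge at noon"
cipher = xor_with_image_key(message, image_key)
assert cipher != message
assert xor_with_image_key(cipher, image_key) == message
print(cipher.hex())
```

Both parties need the same image region; the speed advantage comes from XOR being a single cheap operation per byte.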

Title of the Paper: Analysis of M-Learning Applications Quality


Authors: Catalin Boja, Lorena Batagan

Abstract: The exponential growth of mobile technology in recent years, the increasing availability of network infrastructures, advances in wireless technologies and the popularity of handheld devices have opened up new accessibility opportunities for education. In his research, Kinshuk (2003) [2] concludes that the true potential of e-learning as “anytime and anywhere” has finally started to be realized with the advent of mobile learning, m-learning. A characteristic of today’s society is the increasing use of modern information and communication technologies in all areas, and investment in mobile devices is an important step toward improving the quality of life in our dynamic society. The paper analyses the impact of m-learning on the educational process and describes the software characteristics of m-learning applications. The fields that define m-learning processes, namely education, technology and software development, are taken into consideration. Metrics are described for evaluating the levels of these characteristics, and the measured values are used to determine the overall quality level of an m-learning application. The value is determined by aggregating each factor value, weighted by its importance coefficient.

Keywords: Quality, software, m-learning, mobile learning, characteristic, mobile devices, mobile technologies, handheld device

Title of the Paper: A Persistent Cross-Platform Class Objects Container for C++ and wxWidgets


Authors: Michal Bliznak, Tomas Dulik, Vladimir Vasek

Abstract: This paper introduces a new open-source cross-platform software library, written in the C++ programming language, which is able to serialize and deserialize hierarchically arranged class instances and their data members via XML files. The library provides an easy and efficient way of processing, storing and managing complex object-oriented data with relationships between object instances. The library is based on the mature cross-platform toolkit wxWidgets and thus can be used on many target platforms such as MS Windows, Linux or OS X. The library is published under an open-source licence and can be freely utilised in both open-source and commercial projects. In this article, we describe the inner structure of the library, its key algorithms and principles, and also demonstrate its usage on a set of simple examples.

Keywords: Data, class, persistence, container, serialization, XML, tree, list, C++, wxWidgets, wxXmlSerializer, wxXS

Title of the Paper: An RDF-based Distributed Expert System


Authors: Napat Prapakorn, Suphamit Chittayasothorn

Abstract: An expert system, or knowledge-based system, comprises a knowledge base, in which expert knowledge is represented, and an inference engine. The knowledge can be called upon by the inference engine when needed to solve a problem. In a large expert system, the knowledge base can be represented using frames; such systems are called frame-based expert systems or frame-based systems. To avoid excessive communication traffic during inference, a distributed expert system, with the inference engine on the external knowledge base side, is presented. It is an expert system with an RDF external knowledge base for improved flexibility and mobility in knowledge sharing. This research presents the design and implementation of a frame-based RDF expert system which has an RDF/XML database as its external knowledge base. The external knowledge base uses frames as its knowledge representation, stored in RDF/XML format, so that it can be placed on the WWW (World Wide Web), which is, ideally, accessible from anywhere. With this capability, an expert system is enriched with flexibility and mobility in knowledge sharing.

Keywords: Frames, Expert system, Knowledge base, RDF, RDF/XML, Ontology

Title of the Paper: Design of an Intrusion Detection System Based on Bayesian Networks


Authors: Milan Tuba, Dusan Bulatovic

Abstract: This paper describes the structure of a standalone Intrusion Detection System (IDS) based on a large Bayesian network. To implement the IDS we develop a design methodology for large Bayesian networks. A small number of natural templates (idioms) are defined which make the design of the Bayesian network easier. They are related to specific fragments of Bayesian networks representing the basic elements in reasoning about uncertain events. The idioms represent the graphical structure without the probability tables. The use of idioms speeds up the development of Bayesian networks and improves their quality. An example network is constructed and examined. Such a Bayesian network can act as an independent agent in a distributed system. The results are promising: with very limited computation and low sensitivity to the quality of prior knowledge, potentially dangerous situations are successfully detected and classified.

Keywords: Privacy, Security, Networks, Data protection, Bayesian network, Intrusion detection system (IDS)
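The abstract does not give the network's probability tables; as a minimal, hypothetical illustration of the kind of reasoning a Bayesian IDS performs, Bayes' rule turns an alert into a posterior probability of intrusion (all numbers invented):

```python
# hypothetical numbers, not taken from the paper
p_intrusion = 0.001              # prior probability of an intrusion
p_alert_given_intrusion = 0.95   # detector sensitivity
p_alert_given_benign = 0.02      # false-alarm rate

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H | E) by Bayes' rule."""
    num = p_e_given_h * prior
    return num / (num + p_e_given_not_h * (1 - prior))

p = posterior(p_intrusion, p_alert_given_intrusion, p_alert_given_benign)
print(f"P(intrusion | alert) = {p:.3f}")

# a second independent alert sharply raises the posterior
p2 = posterior(p, p_alert_given_intrusion, p_alert_given_benign)
print(f"P(intrusion | two alerts) = {p2:.3f}")
```

With a low base rate, a single alert leaves the posterior small, while corroborating evidence raises it sharply; a full network chains many such updates across its idiom fragments.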

Title of the Paper: Information Architecture for Intelligent Decision Support in Intensive Medicine


Authors: M. F. Santos, F. Portela, M. Vilas-Boas, J. Machado, A. Abelha, J. Neves, A. Silva, F. Rua

Abstract: A great amount of data is gathered daily in intensive care units, which makes intensive medicine a very attractive field for applying knowledge discovery in databases. Previously unknown knowledge can be extracted from that data in order to create prediction and decision models. The challenge is to perform those tasks in real time, in order to assist doctors in the decision-making process. Furthermore, the models should be continuously assessed and, if necessary, optimized to maintain a certain accuracy. In this paper we propose an information architecture to support an adjustment to the INTCare system, an intelligent decision support system for intensive medicine. We focus on the automation of data acquisition, avoiding human intervention, and describe its steps and some requirements.

Keywords: Real-time data acquisition, knowledge discovery in databases, intensive care, INTCare, intelligent decision support systems, information models.

Title of the Paper: A Grid Data Mining Architecture for Learning Classifier Systems


Authors: M. F. Santos, W. Mathew, T. Kovacs, H. Santos

Abstract: Recently, there has been growing interest among researchers and software developers in exploring Learning Classifier Systems (LCS) implemented on parallel and distributed grid structures for data mining, due to their practical applications. The paper highlights some aspects of LCS and studies a competitive data mining model with homogeneous data. In order to establish a more efficient distributed environment, grid computing architecture is adopted in the current work as the distributed framework for the supervised classifier system (UCS). The structure of this work allows each site of the distributed environment to manage an independent UCS; the local sites transmit their learning models to the global model to build complete knowledge of the problem. The Boolean 11-multiplexer problem is used in the experiments. The main objective of this work is to maintain the average accuracy of the distributed mode without losing accuracy compared to other models. The experimental results show that the testing accuracy of the distributed mode is higher than that of the other models.

Keywords: Learning Classifier Systems, UCS, Genetic Algorithm, Fitness, Accuracy, Data Mining, Grid computing, Cloud computing, Grid Data Mining

Title of the Paper: The Effects of Data Mining in ERP-CRM Model – A Case Study of Madar


Authors: Abdullah S. Al-Mudimigh, Farrukh Saleem, Zahid Ullah

Abstract: As Enterprise Resource Planning (ERP) implementation has become more popular and suitable for every business organization, it has become an essential factor for the success of a business. This paper shows the integration of ERP with Customer Relationship Management (CRM). Data mining supports the integration in this model by helping to apply the best algorithm for achieving successful results. The model has three major parts: an outer view (CRM), an inner view (ERP) and a knowledge discovery view. The CRM part collects customers' queries, the ERP part analyzes and integrates the data, and the knowledge discovery part gives predictions and advice for the betterment of the organization. For the practical implementation of the presented model, we use MADAR data and apply the Apriori algorithm to it. The new rules and patterns are then suggested to the organization, helping it solve customers' problems in future correspondence.

Keywords: Apriori Algorithm, CRM, ERP
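As a sketch of the Apriori step (on invented transactions, not the MADAR data), frequent itemsets can be mined level by level, keeping only candidates whose support meets a minimum threshold:

```python
# invented customer-query transactions, not the MADAR data
transactions = [
    {"billing", "login", "refund"},
    {"billing", "login"},
    {"login", "refund"},
    {"billing", "login", "refund"},
    {"billing", "shipping"},
]

def apriori(transactions, min_support):
    """Level-wise frequent-itemset mining; support is the fraction
    of transactions containing the itemset."""
    n = len(transactions)
    support = lambda s: sum(s <= t for t in transactions) / n
    level = [frozenset([i]) for i in {i for t in transactions for i in t}]
    frequent = {}
    while level:
        level = [s for s in level if support(s) >= min_support]
        frequent.update({s: support(s) for s in level})
        # join step: (k+1)-candidates from pairs of surviving k-itemsets
        level = list({a | b for a in level for b in level
                      if len(a | b) == len(a) + 1})
    return frequent

for s, sup in sorted(apriori(transactions, 0.6).items(),
                     key=lambda kv: (len(kv[0]), sorted(kv[0]))):
    print(sorted(s), sup)
```

Association rules (e.g. "customers who ask about billing also ask about login") are then read off the frequent itemsets by comparing supports.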

Title of the Paper: Parallel Difference Method on Diffusion Equations


Authors: Qinghua Feng, Bin Zheng

Abstract: In this paper, we present a high-order unconditionally stable implicit scheme for diffusion equations. Based on the scheme, a class of parallel alternating group explicit methods is derived, and a stability analysis is given. We then present another parallel alternating group explicit iterative method and complete its convergence analysis. Numerical experiments show that the two methods are of higher accuracy than the original alternating group method.

Keywords: Diffusion equation, parallel computation, finite difference, iterative method, alternating group
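The paper's alternating group schemes are beyond the scope of an abstract, but the baseline they improve on, an explicit finite-difference step for the 1-D diffusion equation u_t = a u_xx, can be sketched as follows (illustrative discretization only, not the authors' scheme):

```python
import math

# explicit FTCS step: u[i] += r * (u[i-1] - 2*u[i] + u[i+1]),
# stable for r = a*dt/dx**2 <= 1/2
nx, a = 51, 1.0
dx = 1.0 / (nx - 1)
r = 0.4                      # within the stability limit
dt = r * dx * dx / a

# initial condition: a sine bump, zero at both boundaries
u = [math.sin(math.pi * i * dx) for i in range(nx)]

def step(u):
    new = u[:]
    for i in range(1, nx - 1):
        new[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

for _ in range(200):
    u = step(u)

# for this initial condition the exact peak decays as exp(-pi**2 * a * t)
t = 200 * dt
print(max(u), math.exp(-math.pi ** 2 * a * t))
```

The interior update at each point needs only its two neighbors, which is what makes such schemes attractive for the parallel, group-wise evaluation the paper develops.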

Title of the Paper: Alternating Group Explicit-Implicit Method and Crank-Nicolson Method for Convection-Diffusion Equation


Authors: Qinghua Feng, Bin Zheng

Abstract: Based on the concepts of alternating groups and domain decomposition, we present a class of alternating group explicit-implicit methods and an alternating group Crank-Nicolson method for solving the convection-diffusion equation. Both methods are effective in convection-dominant cases. The construction concept of the methods is also applied to 2D convection-diffusion equations. Numerical results show that the present methods are superior to the known methods in [6,11,16].

Keywords: Convection-diffusion equation, finite difference, parallel computation, exponential-type transformation, alternating group

Title of the Paper: Online Slant Signature Algorithm Analysis


Authors: Azlinah Mohamed, Rohayu Yusof, Sofianita Mutalib, Shuzlina Abdul Rahman

Abstract: A vector rule-based approach to, and analysis of, an on-line slant signature recognition algorithm is presented. Feature extraction from signatures is an intense area of research due to complex human behavior, which is developed through repetition. Features such as direction, slant, baseline, pressure, speed and the number of pen-ups and pen-downs are some of the dynamic signature information that can be extracted by an online method. This paper presents the variables involved in designing the algorithm for extracting the slant feature. The Signature Extraction Features System (SEFS) is used to extract the slant features of a signature automatically for analysis purposes. The system uses both local and global slant characteristics in extracting the feature. The local slant is the longest of the detected slants, while the global slant represents the most frequent class of slant, whether the slants are leftward, upright or rightward. Development and analysis are reported on a database comprising 20 signatures from 20 subjects. The system is compared to human expert evaluation. The results demonstrate a competitive performance with 85% accuracy.

Keywords: Slant feature, Online signature, Signature recognition, Signature analysis, Dynamic signature

Title of the Paper: A Magneto-statics Inspired Transform for Structure Representation and Analysis of Digital Images


Authors: Xiao-Dong Zhuang, Nikos E. Mastorakis

Abstract: Physical-field inspired methodology has become a new branch of image processing techniques. In this paper, a novel image transform is proposed, imitating the source reverse of the magneto-static field. The image is taken as a vertical magnetic field, and its curl is estimated as the virtual source of the field for image structure representation and analysis. The restoration from the virtual source back to the image is also investigated, based on which a method of image estimation and restoration from its gradient field is proposed. The experimental results indicate that the proposed curl source reverse provides an effective image structure representation, which can be exploited in further image processing tasks.

Keywords: Image transform, curl source reverse, magneto-static, image structure, image restoration

Title of the Paper: Design and Evaluation of FPGA Based Hardware Accelerator for Elliptic Curve Cryptography Scalar Multiplication


Authors: Kapil A. Gwalani, Omar Elkeelany

Abstract: Embedded systems find applications in fields such as defense, communications, industrial automation and many more. For the majority of these applications, security is a vital issue. Over the last decade, security has become the primary concern when discussing e-commerce. The rapid development of information technology has increased the need for techniques capable of providing security, and cryptography plays an important role in providing data security. Until recently, symmetric key encryption schemes were used for the majority of these applications. Now, however, asymmetric key encryption schemes such as elliptic curve cryptography are gaining popularity, as they require less computational power and memory while still providing security equivalent to their counterparts. Elliptic curve cryptography was first introduced in 1985 and has been in use ever since. Scalar or point multiplication in elliptic curve cryptography has been a topic of research interest, since improving its performance improves the overall performance of elliptic curve cryptography. One popular way to improve scalar multiplication is by means of hardware accelerators. The authors of this paper have implemented scalar multiplication, the most time-consuming operation in elliptic curve cryptography, using the binary non-adjacent form algorithm. The results of the software implementation are presented in Section 4. A methodology to improve the performance of scalar multiplication through hardware accelerators is also presented in this paper.

Keywords: Binary Non-adjacent Form, ECC, Prime Field, System on Programmable Chip
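The binary non-adjacent form (NAF) named in the abstract is a standard scalar recoding. As a minimal sketch (not taken from the paper, whose implementation targets FPGA hardware), the recoding and its inverse can be written as:

```python
def naf(k):
    """Compute the non-adjacent form (NAF) of a positive integer k.

    The NAF is a signed binary representation with digits in {-1, 0, 1}
    in which no two consecutive digits are nonzero. It has fewer nonzero
    digits on average than plain binary, which reduces the number of
    point additions in double-and-add scalar multiplication.
    """
    digits = []
    while k > 0:
        if k & 1:
            d = 2 - (k % 4)   # 1 if k = 1 (mod 4), -1 if k = 3 (mod 4)
            k -= d
        else:
            d = 0
        digits.append(d)
        k >>= 1
    return digits  # least significant digit first


def from_digits(digits):
    """Reconstruct the integer from its signed-digit representation."""
    return sum(d << i for i, d in enumerate(digits))
```

Because at most one of any two consecutive NAF digits is nonzero, roughly one third of the digits are nonzero on average versus one half for plain binary, which is where the speedup in scalar multiplication comes from.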

Issue 6, Volume 8, June 2009

Title of the Paper: Efficient Message Authentication Protocol for WSN


Authors: Moises Salinas Rosales, Gina Gallegos Garcia, Gonzalo Duchen Sanchez

Abstract: This paper describes a solution for the node and message authentication problems in wireless sensor networks; it effectively prevents node impersonation and message falsification among the WSN nodes. The resulting protocol addresses authentication at two levels, using identity-based cryptography for node authentication and message authentication codes with SHA-1 for message authentication. An implementation of the message authentication process on a TinyOS-based node is presented, and the power consumption measurements obtained are discussed. Based on the experimental results, we show that the message authentication process is suitable in terms of power consumption.

Keywords: WSN, Authentication, MAC codes, Cryptography, Pairings, TinyOS
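The abstract specifies message authentication codes built on SHA-1 but not the exact construction; HMAC-SHA-1 is the standard choice, and the tag-and-verify step can be sketched in Python for illustration (the paper's implementation runs on TinyOS sensor nodes):

```python
import hmac
import hashlib


def make_mac(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA-1 authentication tag over a sensor message."""
    return hmac.new(key, message, hashlib.sha1).digest()


def verify_mac(key: bytes, message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time, so a forged or
    tampered message is rejected without leaking timing information."""
    return hmac.compare_digest(make_mac(key, message), tag)
```

On reception, a node recomputes the tag with the shared key; any change to the message body invalidates the tag.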

Title of the Paper: Neural Network based Attack Detection Algorithm


Authors: Araceli Barradas-Acosta, Eleazar Aguirre Anaya, Mariko Nakano-Miyatake, Hector Perez-Meana

Abstract: The influence of computer technology on human activities has greatly increased during the last three decades, due to major developments in VLSI technology. However, this widespread use of computer equipment has generated a considerable increase in computer crime. To reduce this problem it is necessary to carry out network analysis using the computer network traffic. The volume of network traffic is huge, however, making the analysis of traffic data complicated. It is therefore necessary to develop an effective and automatic algorithm to carry out traffic network analysis, facilitating the expert forensic work. This paper proposes a network analysis algorithm using a recurrent neural network that can analyze computer network attacks, facilitating evidence extraction. The proposed algorithm can reduce the time and cost of forensic analysis.

Keywords: Computer network, Forensic analysis, Recurrent neural network, Forensic, Attacks

Title of the Paper: A Dual Simulation Environment for Simulating MAS in Telecommunication Networks


Authors: Nada Meskaoui, Dominique Gaiti, Karim Y. Kabalan

Abstract: This paper presents a dual simulation environment for simulating different types of telecommunication networks integrating intelligent agents. Agents are considered advanced tools for resolving complex issues in networking, based on their intelligent and dynamic features. To test the efficiency of these proposals, new simulation environments integrating both agents and network components are required. In this paper we propose an extension of a networking platform with intelligent capabilities. This dual simulation environment has been tested by implementing agents in a DiffServ network to improve its performance. Simulation results show the efficiency of integrating agents within telecommunication networks and also demonstrate that such a dual simulation environment is needed to test new agent- and multi-agent-based techniques in networking.

Keywords: Artificial intelligence, telecommunication, networks, single-agent systems, multi-agent systems, Diffserv network, platforms

Title of the Paper: Improving the Accuracy of Effort Estimation through Fuzzy Set Combination of Size and Cost Drivers


Authors: Ch. Satyananda Reddy, Kvsvn Raju

Abstract: In this research, we investigate the precision of size and cost drivers in effort estimation using the Constructive Cost Model (COCOMO). It is important to stress that uncertainty at the input of COCOMO yields uncertainty at the output, which leads to gross errors in effort estimation. Instead of using a single number to represent the size, it can be characterized as a fuzzy value. Cost drivers are also expressed through imprecise categories that require subjective assessment. Fuzzy logic has been applied to COCOMO using symmetrical triangular and trapezoidal membership functions to represent the cost drivers and size. With trapezoidal membership functions, however, some attribute values are assigned the maximum degree of compatibility when they should be assigned lower degrees. To overcome this limitation, in this work we concentrate on using Gaussian membership functions for the COCOMO parameters. In addition, this paper proposes to incorporate both size and cost drivers together in a fuzzy set using Gaussian membership functions. The present work is based on the COCOMO dataset, and the experimental part of the study illustrates the approach and compares it with the standard version of COCOMO. The proposed method was found to perform better than the original COCOMO, and the achieved results were closer to the actual effort.

Keywords: Constructive Cost Model, COCOMO, Fuzzy based Effort Estimation, Gaussian Membership Function, Software Cost Estimation, Software Effort Estimation, Software Size and Project Management.
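The Gaussian membership function the authors adopt is the standard one. The sketch below (illustrative only; the center and width values in the test are made up, not taken from the paper) shows why it avoids the trapezoid's plateau problem: only the exact center receives the maximum degree of compatibility, and every other value receives a strictly lower degree:

```python
import math


def gaussian_mf(x, c, sigma):
    """Gaussian membership function with center c and width sigma.

    Returns a degree of membership in [0, 1]; it equals 1 only at
    x == c and decays smoothly as x moves away from the center.
    """
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))
```

A trapezoidal function would instead return 1 across its whole top plateau, which is exactly the over-assignment of maximum compatibility the abstract criticizes.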

Title of the Paper: Agent-Based Mediation System to Facilitate Cooperation in Distributed Design


Authors: Victoria Eugenia Ospina, Alain-Jerome Fougeres

Abstract: In this article we present the use of knowledge in a Mediation System developed to support participants in mechanical system design activities. To use a cooperative system, sufficient assistance is needed to facilitate and coordinate the actors' activities. To accomplish this goal we introduce an artificial actor, the Mediator. The Mediator forms part of the group of collaborating actors, with the specific role of facilitating the cooperative activity. This role of assistance differentiates the Mediator from the other actors. The Mediator is endowed with specific cooperation skills (communication, awareness, coordination, co-memorization) that require acquired knowledge, allowing it to assist the human actors. We define the types of knowledge used in our proposed Mediation System, and then illustrate the use of memorized knowledge by the Mediator during a technical functional analysis activity.

Keywords: CSCW, Intelligent Agent System, Knowledge and Data technology, Mediation System

Title of the Paper: A Wavelet-Based Voice Activity Detection Algorithm in Variable-Level Noise Environment


Authors: Kun-Ching Wang

Abstract: In this paper, a novel entropy-based voice activity detection (VAD) algorithm for variable-level noise environments is presented. Since the frequency energy of different types of noise is concentrated in different frequency subbands, the effect of the corrupting noise differs across subbands. It is found that seriously obscured frequency subbands retain little word signal information and are harmful for detecting voice activity segments (VAS). First, we use bark-scale wavelet decomposition (BSWD) to split the input speech into 24 critical subbands. To discard the seriously corrupted frequency subbands, a method of adaptive frequency subband extraction (AFSE) is then applied so that only the reliable subbands are used. Next, we propose an entropy measure defined on the spectral domain of the selected frequency subbands to form a robust voice feature parameter. In addition, since unvoiced speech is usually eliminated by such a measure, an unvoiced detection stage is integrated into the system to improve the intelligibility of the detected voice. Experimental results show that the performance of this algorithm is superior to G.729B and other entropy-based VADs, especially for variable-level background noise.

Keywords: Voice activity detection, Bark-scale wavelet decomposition, Adaptive frequency subband extraction
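The paper's entropy feature is defined on the subbands selected after BSWD and AFSE; the generic spectral-entropy computation it builds on can be sketched as follows (a simplified illustration, not the authors' exact feature):

```python
import math


def spectral_entropy(power_spectrum):
    """Entropy of the normalized power spectrum of one analysis frame.

    Speech frames concentrate energy in a few spectral bins (low
    entropy), while noise-like frames spread energy more evenly (high
    entropy), so thresholding this value separates voice activity
    from background noise.
    """
    total = sum(power_spectrum)
    probs = [p / total for p in power_spectrum if p > 0]
    return -sum(p * math.log2(p) for p in probs)
```

A frame with all its energy in one bin has entropy 0; a perfectly flat spectrum over N bins reaches the maximum log2(N).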

Title of the Paper: Research on Protocol-Level Behavioral Substitutability of Software Components in Component-based Software System


Authors: Haiyang Hu, Hua Hu

Abstract: Component-based software development (CBSD) has received increasing attention from software practitioners in recent years. Analyzing and verifying behavior-level component substitutability is very important when a component-based software system needs to be upgraded or maintained. Focusing on component-based software systems, this paper formally specifies components and their interaction behaviors, analyzes the behavior of a new component compared with the old one, and then presents a set of rules for verifying the behavioral substitutability of components, so as to ensure behavioral compatibility whenever a component is replaced by a new one. Finally, an e-commerce example is presented to illustrate the feasibility and pertinence of this approach.

Keywords: Software component, Component composition, Behavioral compatibility, Behavioral substitutability

Title of the Paper: A Hybrid Approach for Indexing and Searching Protein Structures


Authors: Tarek F. Gharib

Abstract: Searching for structural similarities of proteins has a central role in bioinformatics. Most bioinformatics tasks depend on investigating a homologous protein's sequence or structure; these tasks vary from predicting the protein structure to determining the sites in a protein structure where a drug can attach. The protein structure comparison problem is extremely important in many tasks: it can be used for determining the function of a protein, for clustering a given set of proteins by their structure, and for assessment in protein fold prediction. Protein Structure Indexing using Suffix Array and Wavelet (PSISAW) is a hybrid approach that provides the ability to retrieve similar proteins based on their structures. Indexing the protein structure is one approach to searching for protein similarities. Suffix arrays are used to index the protein structures, and wavelets are used to compress the indexed database. Compressing the indexed database makes searching faster and memory usage lower, while affecting accuracy only within an accepted rate of error. The experimental results, which are based on the Structural Classification of Proteins (SCOP) dataset, show that the proposed approach outperforms existing similar techniques in memory utilization and searching speed. The results show an enhancement in memory usage by a factor of 50%.

Keywords: Protein structures, indexing, suffix array, wavelet

Title of the Paper: Symbolic Algorithmic Verification of Generalized Noninterference


Authors: Conghua Zhou

Abstract: In this paper we propose an algorithmic verification technique to check generalized noninterference. Our technique is mainly based on a counterexample search strategy that generates counterexamples of minimal length. In order to make the verification procedure terminate as soon as possible, we also discuss how to integrate the window induction proof strategy into our technique. We further show how to reduce both the counterexample search and the induction proof to quantified propositional satisfiability. This reduction enables us to use efficient quantified propositional decision procedures to perform generalized noninterference checking.

Keywords: Generalized noninterference, Quantified propositional satisfiability, Multilevel security

Title of the Paper: DRILA: A Distributed Relational Inductive Learning Algorithm


Authors: Saleh M. Abu-Soud, Ali Al-Ibrahim

Abstract: This paper describes a new rule discovery algorithm called the Distributed Relational Inductive Learning Algorithm (DRILA). It has been developed as part of ongoing research on the Inductive Learning Algorithm (ILA) [11] and its extension ILA2 [12], which were built to learn from a single table, and the Relational Inductive Learning Algorithm (RILA) [13], [14], which was developed to learn from a group of interrelated tables, i.e. a centralized database. DRILA allows the discovery of distributed relational rules using data from distributed relational databases. Such a system consists of a collection of sites, each of which maintains a local database system, or a collection of multiple, logically interrelated databases distributed over a computer network. The basic assumption of the algorithm is that the objects to be analyzed are stored in a set of tables distributed over many locations. The distributed relational rules discovered can be used either to predict an unknown object attribute value or to extract the hidden relationships between the objects' attribute values. The rule discovery algorithm was designed to use data available from many locations (sites) and any 'connected' schema at each location, in which the tables concerned are linked by foreign keys. To achieve reasonable performance, the hypothesis search algorithm was implemented to allow the construction of new hypotheses by refining previously constructed ones, thereby avoiding re-computation. Unlike many other relational learning algorithms, DRILA does not need its own copy of the distributed relational data in order to process it. This is important for the scalability and usability of the distributed relational data mining solution that has been developed. The proposed architecture can be used as a framework to upgrade other propositional learning algorithms to relational learning.

Keywords: Distributed Relational Rule Induction, Rule Selection Strategies, Inductive Learning, ILA, ILA2, RILA, DRILA

Title of the Paper: Research on Cyclic Mapping Model and Solving Approach for Conceptual Design


Authors: S. Zhang

Abstract: An ideal conceptual design model should support multi-level innovative design through rational mapping layers and mapping relationships. In this paper, five existing conceptual design models are reviewed from this perspective, and a new model is presented that supports the alternation of cyclic mappings among functional decomposition, functional solving and the combination of solutions in turn. Then the knowledge representation scheme for principle solving and the feature-based scheme for interface representation are discussed. Finally a cyclic solving approach to conceptual design is put forward, and an interface matrix is applied to facilitate computer processing.

Keywords: Computer aided design, Conceptual design model, Knowledge representation scheme, Design catalogue, Feature model, Interface matrix

Title of the Paper: RFID based Learning Assessment System


Authors: V. Drona, S. Drona, C. Rusell, M. H. N. Tabrizi

Abstract: Learning assessment, or traditional testing, is typically performed with the participation of a proctor, who must be physically present in the classroom at all times. Protecting the integrity of learning assessments, in any form, often involves complicated procedures and adjustments to the assessment process and environment. The Radio Frequency Identification (RFID) based Learning Assessment System (RLAS) aims to remove the need for a human proctor and to reduce the complexity involved in assessment. This is particularly important when the assessment is more complex than the traditional pencil-and-paper variety. RLAS integrates RFID tags, Pocket PCs (PDAs) and a server to create a complete learning assessment system. RFID tags are used to track students' movements, and PDAs are used as communication devices. Students are then monitored by a virtual proctor on a server. In this paper the development of the RLAS system is described, and its use as a tool to develop and execute individualized learning environments for K-12 students is considered. The system integrates audio, video and common textual learning tools depending on the student's location and learning style needs. The system further assesses the effectiveness of the chosen tools in the student's learning process.

Keywords: Learning Assessment, Proctor, RFID, Sensors, Communication

Title of the Paper: Solving Traveling Salesman Problem on Cluster Compute Nodes


Authors: Izzatdin A. Aziz, Nazleeni Haron, Mazlina Mehat, Low Tan Jung, Aisyah Nabilah Mustapa, Emelia Akashah Patah Akhir

Abstract: In this paper, we present a parallel implementation of a solution for the Traveling Salesman Problem (TSP). TSP is the problem of finding, for a given set of points, the shortest route that passes through each point exactly once. Initially a sequential algorithm is built from scratch and written in the C language. The sequential algorithm is then converted into a parallel algorithm by integrating it with the Message Passing Interface (MPI) libraries so that it can be executed on a cluster computer. Our main aim in creating the parallel algorithm is to accelerate the execution time of solving TSP. Experimental results conducted on a Beowulf cluster are presented to demonstrate the viability of our work as well as the efficiency of the parallel algorithm.

Keywords: Traveling Salesman Problem (TSP), High Performance Computing (HPC), Message Passing Interface (MPI)
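The abstract does not reproduce the algorithm itself. As an illustration of the sequential starting point (the paper's version is written in C), a brute-force solver can be sketched as below; the MPI parallelization then amounts to each rank evaluating its own share of the permutations before a global reduction picks the best tour:

```python
import math
from itertools import permutations


def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    n = len(order)
    return sum(math.dist(points[order[i]], points[order[(i + 1) % n]])
               for i in range(n))


def solve_tsp(points):
    """Exact brute-force TSP: fix city 0 as the start and try every
    ordering of the remaining cities, keeping the shortest closed tour."""
    rest = range(1, len(points))
    best = min(permutations(rest),
               key=lambda p: tour_length(points, (0,) + p))
    return (0,) + best, tour_length(points, (0,) + best)
```

Brute force is only feasible for small instances, which is precisely why the paper distributes the work across cluster compute nodes.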

Title of the Paper: The Investigation of Discovering Potential Musical Instruments Teachers by Effective Data Clustering Scheme


Authors: Cheng-Fa Tsai, Yu-Tai Su, Chiu-Yen Tsai, Chun-Yi Sung

Abstract: Data clustering plays an important role in various fields, and many data clustering approaches have been designed in recent years. This investigation presents a data clustering algorithm to identify potential musical instrument teachers, using a total of 5125 candidates registered in the 9 grades of the Taiwan United Music Grade Test during 2000-2008. This study applies a new data clustering algorithm called MIDBSCAN, along with the well-known self-organizing map (SOM) neural network, to data clustering applications for discovering potential musical instrument teachers. The procedure of searching for neighbors (neighborhood data points) is very time consuming in the existing well-known DBSCAN and IDBSCAN algorithms. Therefore, to shorten the time consumed, the proposed MIDBSCAN algorithm focuses on lowering the number of expansion seeds added to the neighborhood data in this procedure, thus reducing the time cost of searching for neighbors. According to our simulation results, the proposed MIDBSCAN approach has a low execution time cost, with only small deviations in clustering correctness rate and noise data filtering rate, and it outperforms SOM in execution time cost. It is therefore feasible to perform data clustering analysis in various data mining applications using the proposed MIDBSCAN algorithm.

Keywords: Data clustering, data mining, effective teaching, music examine
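The neighborhood search that MIDBSCAN accelerates is DBSCAN's region query. A minimal sketch of that baseline operation is shown below (the paper's seed-reduction rule itself is not detailed in the abstract, so only the costly step it optimizes is illustrated):

```python
import math


def region_query(points, idx, eps):
    """Return the indices of all points within distance eps of
    points[idx]. This linear scan is performed once per expansion
    seed in DBSCAN, which is why it dominates the running time and
    why reducing the number of seeds speeds up clustering."""
    return [j for j, q in enumerate(points)
            if math.dist(points[idx], q) <= eps]
```

Each call costs O(n), so a clustering run that enqueues fewer expansion seeds performs proportionally fewer of these scans.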

Issue 7, Volume 8, July 2009

Title of the Paper: A Nonce-Based Mutual Authentication System with Smart Card


Authors: Chin-Ling Chen, Wei-Chech Lin, Zong-Min Guo, Yung-Fa Huang, Neng-Chung Wang

Abstract: User authentication is an important security mechanism for recognizing legal remote users. We propose a practical and secure authentication scheme that allows the service provider to verify users without using a verification table. It resists most attacks by improving the nonce-based mutual authentication mechanism, and ensures security through a dynamic session key. Users may change their passwords freely. Our scheme is compared with other related schemes in terms of security and efficiency.

Keywords: Mutual authentication, RSA, smart card

Title of the Paper: Mathematical Modeling and Numerical Analysis of the Programming Field in PEO10LiCF3SO3-Polypyrrole Neural Switch


Authors: Mahmoud Z. Iskandarani

Abstract: The design and numerical modeling, using Finite Element Analysis (FEA), of the electric field strength in a programmable neural switch is carried out. The obtained model provides a good approximation to the derived complex analytical solution, which is obtained by means of complex mathematical analysis employing the Schwarz-Christoffel transform. The effects of electrode separation and of field spread in both the x and y directions are studied and explained. Boundary effects on the field strength representation are discussed and numerically reduced by increasing the number of nodes for each element in the finite grid. The edge effect on field strength is also eliminated using a semi-infinite coplanar electrode approximation. Such a switch will function as a synaptic processor behaving in an adaptive manner, suitable for use as a compact programmable device alongside other artificial neural network hardware.

Keywords: Neural, Numerical, Finite Element, Mathematical, Memory, Information Processing, Polymers

Title of the Paper: Recent Advances in Data Management


Authors: Zeljko Panian

Abstract: To increase the value of data as a business asset, companies and government organizations need to establish standards, policies, and processes for the usage, development, and management of data, to create the right organizational structure, and to develop the supporting technology infrastructure. Data governance emerged as a practice with roots in corporate and information technology (IT) governance. It can be defined as the processes, policies, standards, organization, and technologies required to manage and ensure the availability, accessibility, quality, consistency, auditability, and security of data in an organization.

Keywords: Information technology (IT), data management, data integration, corporate governance, IT governance, data governance, compliance

Title of the Paper: Toward Developing Medium-long Term Spot Trading E-Transaction Platform for Bulk Agricultural Product in China


Authors: Zheng Xiaoping, Wang Ruimei, Tian Dong, Zhang Xiaoshuan

Abstract: Medium-long term spot trading has developed rapidly in China. This paper describes the features of medium-long term spot trading and develops an e-commerce platform for bulk agricultural products based on the medium-long term spot trading model. Finally, the benefits of and challenges to medium-long term spot trading are discussed. The establishment of medium-long term transactions plays a significant role in the regulation of supply and demand, the reduction of price volatility, and price discovery. However, medium-long term spot trading is still at an exploratory stage, and many problems and challenges remain.

Keywords: Medium-long term spot trading, E-commerce platform, Bulk agricultural product, China

Title of the Paper: A Study of Mobile E-learning-Portfolios


Authors: Hsieh-Hua Yang, Jui-Chen Yu, Lung-Hsing Kuo, Li-Min Chen, Hung-Jen Yang

Abstract: Based on the applications of e-portfolios, a mobile system infrastructure was designed, with a discussion of the information flow scenarios. This study intends to determine the impacts that mobile technology has exerted on the educational system through the portfolio. To fully understand the potential impact of a mobile electronic learning portfolio system on education, institutions need to take into account the current environment within higher education. Evaluating student performance has always been an important issue for academia. This study proposes a four-step model of mobile e-learning portfolios, consisting of the steps of accumulating, revealing, proposing and publishing.

Keywords: Mobile learning, E-learning-Portfolio

Title of the Paper: An Imprecise Computation Framework for Fault Tolerant Control Design


Authors: Addison Rios-Bolivar, Luis Parraguez, Francisco Hidrobo, Margarita Heraoui, Julima Anato, Francklin Rivas

Abstract: In this work, the relations between imprecise computation and fault tolerant control (FTC) are analyzed. From those relations, FTC systems are constructed according to the imprecise computation model. The relations found establish that the mandatory tasks in the imprecise computation model correspond to the mission of maintaining the stability of the systems under FTC, according to the redundancy degree that assures the structural properties of the systems. The optional tasks in imprecise computation, on the other hand, correspond to satisfying the performance criteria of the systems under FTC, which can be degraded under adverse operating conditions. These correspondences are demonstrated in an example.

Keywords: Imprecise Computation, Fault Tolerant Control, Real-Time, Fault Detection, Anytime Computation

Title of the Paper: Adaptive Knowledge Discovery for Decision Support in Intensive Care Units


Authors: Pedro Gago, Manuel Filipe Santos

Abstract: Clinical Decision Support Systems (CDSS) are becoming commonplace. They are used to alert doctors about drug interactions, to suggest possible diagnoses, and in several other clinical situations. One approach to building a CDSS is to use techniques from the Knowledge Discovery in Databases (KDD) area. However, using KDD for the construction of the knowledge base of such systems, while reducing the maintenance work, still demands repeated human intervention. In this work we present a KDD-based CDSS architecture for intensive care medicine. By resorting to automated data acquisition, our architecture allows for the evaluation of the predictions made and subsequent action aimed at improving predictive performance, thus enhancing adaptive capacities.

Keywords: Clinical Decision Support, Intelligent Decision Support Systems, Knowledge Discovery, Intensive care, Ensembles, Multi-Agent Systems

Title of the Paper: Towards A Utility Based Computational Model for Negotiation between Semantic Web Services


Authors: Sandeep Kumar, Nikos E. Mastorakis

Abstract: Using utility theory for the decision-making process during negotiation between semantic web services is appealing. This paper proposes a computational model for calculating the utilities of the negotiating semantic web services. The proposed model uses multiple attributes in the utility function, based on the basic values of these attributes, such as the offered price, quality and others. The model is based on the insight that a service requester should remain indifferent to changes in price or other such values if the corresponding quality has also been changed accordingly. A prototype system has been implemented in support of the proposed model. The work has been evaluated and the improvement is reported.

Keywords: Utility, semantic web, semantic web services, negotiation, computation

Title of the Paper: Information Analysis of Rail Track for Predictive Maintenance


Authors: R. B. Faiz, S. Singh

Abstract: Track defects are deviations of the actual from the theoretical values of the track's geometrical characteristics. Track defects are macroscopic and geometric in nature and are exclusively the consequence of train traffic [5]. Rail track maintenance, in terms of track geometry and other modalities such as rail profile and ultrasonics, has typically been based on reactive maintenance. In such reactive maintenance, discrete exceedances of track geometry parameter measurements are compared against a pre-set threshold; if they exceed the threshold, reactive maintenance is carried out. This is particularly so when current practices rely on conservative engineering decisions. Moreover, reactive maintenance does not inform staff of developing rail track defects, knowledge of which could subsequently support predictive maintenance. This paper focuses on variable- and time-based linear regression analysis of track geometry parameters, which can lead to significant predictive maintenance of track geometry. The overall aim of this research is to propose a predictive maintenance framework that would assist in predicting future changes in rail track geometry measurements. Such a framework can evaluate and prioritise track geometry maintenance effort across Network Rail, and can raise alarms before defects actually occur. Such research will result in effective and efficient rail track maintenance management, yielding lower operating costs and better train transit times for the rail industry [2].

Keywords: Predictive Maintenance Management, Rail Track Geometry, CURV, Cant Def, Cross Level, Dipped Left, Gauge
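The time-based linear regression underlying the proposed framework can be sketched as an ordinary least-squares fit of a geometry measurement against inspection time, extrapolated forward to predict future values (a generic illustration, not the authors' exact model or data):

```python
def fit_line(ts, ys):
    """Ordinary least-squares fit of y = a + b*t, returned as (a, b).

    ts are inspection times and ys the corresponding track geometry
    measurements; the slope b is the estimated degradation rate.
    """
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return my - b * mt, b


def predict(coeffs, t):
    """Extrapolate the fitted trend to a future inspection time t."""
    a, b = coeffs
    return a + b * t
```

Comparing the extrapolated value against the maintenance threshold gives the predicted time of exceedance, which is what allows an alarm to be raised before the defect occurs.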

Title of the Paper: Implementation and Evaluation of a Peer Assessment System for Enhancing Students' Animation Skills


Authors: Jung-Chuan Yen, Songshen Yeh, Jin-Lung Lin

Abstract: A web-based peer-assessment system for enhancing students' creativity and animation skills is presented. Through a quasi-experimental instruction, this paper examines the learning effects of the system with its mechanisms of diverse assessment. The participants were 88 sophomores in 2D animation courses taught by the same instructor. A two-way ANOVA was conducted with self-regulation (high vs. low) and perceived initiative (stronger vs. weaker) as independent variables and project performance as the dependent variable. The results revealed that (a) there are no significant interactions between self-regulation and perceived initiative, (b) students in the high self-regulation group outperformed the low group on the peer assessment, (c) students who showed stronger initiative outperformed the weaker ones on the tutor assessment, and (d) most of the learners showed a positive attitude toward peer assessment.

Keywords: E-learning, Animation skill, Peer-assessment, Self-regulation, Social cognitive

Title of the Paper: Data Attribute Reduction using Binary Conversion


Authors: Fengming M. Chang

Abstract: When learning from data with a large number of attributes, a system can easily freeze, shut down, or run for a very long time. The proposed Binary Conversion (BC) is a novel method for solving this kind of large-attribute problem in machine learning. The purpose of BC is to reduce the data dimensions through a binary conversion process: all the attributes are retained but combined into a small number of new attributes, rather than removing some attributes. To prevent information loss during the conversion, each binary data value occupies its own digit position in BC. Four data sets (nbuses, ACLP, MONK3, and Buseskod) are used in this study to test and compare learning accuracy and learning time. The results indicate that the proposed BC keeps about the same level of accuracy while increasing learning efficiency.

Keywords: Binary conversion, Large attribute, Machine learning, Neuro-fuzzy, Mega-fuzzification
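
The core idea of the abstract (reserving every attribute while packing groups of binary values into shared integer columns, each value keeping its own digit position) can be sketched as follows; the function name and grouping scheme are illustrative assumptions, not the authors' implementation:

```python
def binary_pack(rows, group_size):
    """Combine groups of 0/1 attributes into single integer attributes.

    Each original binary attribute keeps its own bit (digit) position,
    so no information is lost; only the number of columns shrinks.
    (Illustrative sketch of the idea in the abstract, not the paper's code.)
    """
    packed = []
    for row in rows:
        new_row = []
        for i in range(0, len(row), group_size):
            value = 0
            for bit in row[i:i + group_size]:  # each attribute -> its own bit
                value = (value << 1) | bit
            new_row.append(value)
        packed.append(new_row)
    return packed

# Four binary attributes reduced to two integer attributes per row.
data = [[1, 0, 1, 1],
        [0, 1, 0, 0]]
print(binary_pack(data, 2))  # [[2, 3], [1, 0]]
```

Because every value occupies its own bit, the original attributes can be recovered exactly from the packed columns.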

Title of the Paper: Performance Analysis for A Web Based Session and Error Management Agent on Home Network Environment


Authors: Eung-Nam Ko

Abstract: This paper describes web-based session and error management with a whiteboard agent running on a home network environment. It suggests a system that is capable of detecting and recovering from software errors for distributed multimedia, with an integrated whiteboard model that supports object drawing, application sharing, and web synchronization methods of sharing information through a common view between concurrently collaborating users in a home network environment. DOORAE is a framework supporting the development of multimedia applications for distributed multimedia running on home networks. This paper presents a performance analysis of a session and error management agent on a home network environment using rule-based DEVS (Discrete Event System Specification) modeling and simulation techniques. In DEVS, a system has a time base, inputs, states, outputs, and functions. The DEVS formalism, introduced by Zeigler, provides a means of specifying a mathematical object called a system.

Keywords: Session and error management, whiteboard agent, home network environment, software error, DEVS modeling, discrete event model

Title of the Paper: Personal Identification by Finger Vein Images Based on Tri-value Template Fuzzy Matching


Authors: Liukui Chen, Hong Zheng

Abstract: To reduce the effect of fuzzy vein edges and tips in infrared finger vein recognition, this paper presents a tri-value template fuzzy matching algorithm, which segments the vein feature image into three areas (subject area, fuzzy area, and background area) and computes the average distance of non-background points to the non-background area as the dissimilarity score between two templates. The proposed matching method is robust against fuzzy edges and tips. The experimental results show that the proposed method is feasible and practical, achieving a recognition accuracy rate of 99.46% on 456 near-infrared finger vein images.

Keywords: Personal identification; Infrared finger vein; Tri-value template; Fuzzy matching
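
A simplified reading of the tri-value matching score (the average distance from each non-background point of one template to the nearest non-background point of the other) might look like the sketch below; the label values and the nearest-point interpretation are assumptions for illustration, not the authors' exact algorithm:

```python
import math

SUBJECT, FUZZY, BACKGROUND = 2, 1, 0  # illustrative tri-value labels

def dissimilarity(t1, t2):
    """Average distance from each non-background point of t1 to the
    nearest non-background point of t2 (a simplified reading of the
    abstract's dissimilarity score)."""
    pts2 = [(r, c) for r, row in enumerate(t2)
                   for c, v in enumerate(row) if v != BACKGROUND]
    dists = []
    for r, row in enumerate(t1):
        for c, v in enumerate(row):
            if v != BACKGROUND:
                dists.append(min(math.hypot(r - r2, c - c2)
                                 for r2, c2 in pts2))
    return sum(dists) / len(dists) if dists else 0.0
```

Identical templates score 0.0, and small shifts of the vein pattern raise the score only gradually, which is what makes this style of matching tolerant of fuzzy edges.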

Title of the Paper: The Relationship of Sample Size and Accuracy in Radial Basis Function Networks


Authors: Hyontai Sug

Abstract: Even though radial basis function networks are known to have good prediction accuracy in several domains, there is no known way to decide a proper sample size, unlike for other data mining algorithms, so the task of deciding proper sample sizes for the networks tends to be arbitrary. As the size of samples grows, the improvement in error rates slows. But we cannot use larger and larger samples, because we have limited training examples, and there is some fluctuation in accuracy depending on the sample sizes. This paper suggests a progressive resampling technique to cope with the fluctuation of prediction accuracy values for better radial basis function networks. The suggestion is supported by experiments with promising results.

Keywords: Neural networks, radial basis function networks, sampling
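
A generic progressive-sampling loop of the kind the abstract suggests (not the authors' exact procedure) can be sketched as follows; `evaluate` is a placeholder standing in for training and scoring an RBF network on a sample:

```python
import random

def progressive_sample(train, evaluate, start=100, growth=2.0, tol=0.005):
    """Grow the training sample geometrically until the gain in accuracy
    drops below `tol` (a generic progressive-sampling sketch; `train` is
    a list of examples, `evaluate` fits and scores a model on a sample)."""
    size, best_acc, best_size = start, 0.0, start
    while size <= len(train):
        sample = random.sample(train, size)
        acc = evaluate(sample)
        if acc - best_acc < tol and size > start:
            break                      # accuracy has plateaued
        if acc > best_acc:
            best_acc, best_size = acc, size
        size = int(size * growth)
    return best_size, best_acc
```

The tolerance `tol` absorbs the accuracy fluctuations the abstract mentions, so the loop stops once larger samples no longer pay off.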

Title of the Paper: Computer Simulation and Experimental Research of the Vehicle Impact


Authors: Daniel Trusca, Adrian Soica, Bogdan Benea, Stelian Tarulescu

Abstract: This paper investigates the vehicle rear impact and its consequences and analyzes the research developed in this field. In road traffic, especially in urban traffic, numerous rear collisions take place, resulting in serious injuries for passengers. The numerous studies that have been carried out lead to today's internationally recognized demand for adjusting the car driver's head restraint for optimal protection of the cervical spine (HWS) in rear-end collisions. Mathematical modeling of passenger movement during head impact may be a successful tool in establishing the neck injury mechanism, especially when working in parallel with experimental studies. In order to carry out experimental studies for data acquisition and for video and image sample analysis, the testing devices must be prepared. Validating mathematical models with experimental tests opens a vast range of uses. Thus, the phenomena that take place during road events can be studied more comprehensively by computer simulation, which may allow optimizing passive safety systems in order to diminish the adverse effects of these phenomena.

Keywords: Modeling, collision, simulation, experimentation, safety, validation

Title of the Paper: An Efficient Garment Visual Search based on Shape Context


Authors: Chin-Hsien Tseng, Shao-Shin Hung, Jyh-Jong Tsay, Derchian Tsaih

Abstract: In recent years, mass consumer behavior has shifted toward buying from websites rather than in stores. Because of the high growth of e-commerce, a new demand emerges: special-purpose search engines for finding goods in network shops. How to meet the customer's requirements in product search is an important problem. Although it is easy for human eyes to determine the existence of clothing styles, recognizing them automatically with a computer program is not a trivial problem. Our work focuses on garment retrieval from an e-shopping database, which supports feature-based retrieval by shape categories and styles. Traditional rigid shape-based algorithms do not apply well to garment images, because clothing is essentially a non-rigid soft object: it is apt to self-occlusion and folding, and deforms in every part (such as sleeves and tubes). While deforming, it is also influenced by light, which leads to various kinds of shading on the clothes, and the surface may include various patterns, textures, small pieces, and decorations, all of which cause great interference in image analysis.

Keywords: Visual Search, Visual Similarity, Garment, CBIR, Non-Rigid Matching, Shape Context

Title of the Paper: A Visual Analysis of Calculation-Paths of the Mandelbrot Set


Authors: Raka Jovanovic, Milan Tuba

Abstract: In this paper we present a new method for analyzing some properties of the Mandelbrot Set. The algorithm used for our visual analysis is closely connected to the Pickover Stalks and Buddhabrot methods. The Pickover Stalks method created biomorphs: diverse and complicated forms greatly resembling invertebrate organisms. Our method extends these previously developed methods, which introduce the concept of preserving information about calculation steps when computing the Mandelbrot Set. We create images that visualize statistical information about the calculation-paths of points tested for membership in the Mandelbrot Set. Two variations of this method are presented: one that only takes into account the paths taken and one that also uses information about their lengths. We have developed software that enables us to analyze new properties of the Mandelbrot Set that become visible when these new display methods are used. In this paper we also present in detail important features of our software.

Keywords: Mandelbrot, Buddhabrot, Fractal, Algorithm, Pickover Stalks
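
The notion of preserving calculation paths can be illustrated with a small Buddhabrot-style sketch (an assumption about the general technique, not the authors' software): record each tested point's orbit and histogram all visited positions.

```python
def orbit(c, max_iter=50, bound=2.0):
    """Iterate z -> z^2 + c from 0 and record the calculation path
    (the sequence of z values), stopping on escape."""
    z, path = 0j, []
    for _ in range(max_iter):
        z = z * z + c
        path.append(z)
        if abs(z) > bound:
            break
    return path

def accumulate(points, size=64):
    """Histogram all path points on a size x size grid over [-2, 2]^2,
    giving the statistical calculation-path image the abstract describes."""
    grid = [[0] * size for _ in range(size)]
    for c in points:
        for z in orbit(c):
            x = int((z.real + 2.0) / 4.0 * size)
            y = int((z.imag + 2.0) / 4.0 * size)
            if 0 <= x < size and 0 <= y < size:
                grid[y][x] += 1
    return grid
```

Points inside the set contribute long paths, escaping points short ones, so the accumulated grid encodes both variants mentioned in the abstract (paths alone, or paths weighted by length).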

Title of the Paper: Mining the Body Features to Develop Sizing Systems to Improve Business Logistics and Marketing Using Fuzzy Clustering Data Mining


Authors: Chih-Hung Hsu, Tzu-Yuan Lee, Hui-Ming Kuo

Abstract: Business logistics play an important role in business operations: procurement, purchasing, inventory, warehousing, distribution, transportation, customer support, finance, and human resources. Human body type classification is also a crucial issue for garment manufacturing. Data mining has been widely used in many fields, but there is a lack of research in the area of establishing garment-sizing systems for business logistics. This research aims to establish sizing systems of body types from the anthropometric data of females by using a fuzzy clustering-based data mining approach. Certain advantages may be observed when the sizing systems are established using a fuzzy clustering-based data mining procedure. Body types can be accurately classified for garment production according to the new sizing systems. This approach is found to be effective in processing the anthropometric data and obtaining regular rules for the development of sizing systems. The results of this study can provide an effective procedure for identifying the clusters of human body types to establish sizing systems for integrating logistics operations internally, across different functions inside the organization, and also externally with business partners.

Keywords: Data mining; Fuzzy clustering; Sizing systems; Business logistics

Issue 8, Volume 8, August 2009

Title of the Paper: A Framework for Teaching Introductory Software Development


Authors: Zaigham Mahmood

Abstract: Software development (SD) refers to the design and development of software applications. Most educational institutions teach programming using a procedural paradigm and an imperative language, where the emphasis is often on learning a computer language and not on problem solving or the modeling of realistic computational problems. Thus, the teaching is dependent on the chosen language, which is not entirely appropriate for teaching the principles of programming or SD as an engineering activity. This paper discusses the traditional method of teaching programming and suggests an objects-first approach where students adopt a top-down method of learning to develop software. Our model introduces functions and modules as basic building blocks for producing software. Thus, students' first programs are written as sequences, selections, and iterations of given functions, and it is in the later stages of the course that they learn the basic constructs of the language. This paper outlines a complete framework for teaching a first course in programming. It also discusses the characteristics of a good teaching language to help academics choose an appropriate first programming language.

Keywords: Software engineering, Software development, Programming, Computer languages, Teaching

Title of the Paper: The Web 2.0 Movement: MashUps Driven and Web Services


Authors: Chang-Chun Tsai, Cheng-Jung Lee, Shung-Ming Tang

Abstract: Service-Oriented Architecture (SOA) can be viewed as a philosophy that drives the development of components by defining their interfaces clearly and in a way that relates to real needs. It is the key to IT and business flexibility and receives a lot of attention from academia and industry as a means to develop flexible and dynamic software solutions. The Web 2.0 world is wide and rich. Although significant progress is being made on several fronts, when researchers speak of Web 2.0 applications, they tend to focus on the technology aspects of the environment. However, the real impact of integrating Web 2.0 technologies is to tie the flexibility of Web 2.0 to the service-oriented principles of loose coupling, encapsulation, and reuse that are the heart and soul of SOA. Today's sites are no longer limited to exchanging links and interacting via hypertext; instead, the interconnectedness of the Internet has become progressively more important with the rise of web services. This paper presents Web 2.0 mashups remixing data and web services. The purpose of this paper is to propose a flexible solution and new mashup platforms for e-applications and interactive services. Flexibility is the key driver of Web 2.0 success: the flexible delivery of data through the combination of services and disparate data sources through mashups, real-time data feeds, and rich interactions. In this paper we also explore the architectural basis, technologies, frameworks, and tools considered necessary to realize this novel vision of Web 2.0, all of which adds business value and helps companies utilize the rich collaboration and communication of the Internet today.

Keywords: SOA, Web services, mashups, Web 2.0, Service-Oriented, platform, blogs

Title of the Paper: Variable Precision Distance Search for Random Fractal Cluster Simulations


Authors: Sosa-Herrera Antonio, Rodriguez-Romo Suemi

Abstract: A simple and effective algorithm for performing distance queries between a large number of points stored in quadtrees and octrees is presented. The algorithm is developed and tested for the construction of diffusion-limited aggregates (DLA). To speed up the search we accept approximate distance values with low precision at the first levels of the hierarchical structure, and accurate ones at the last level. The structure of the trees is the only feature used for the determination of approximate distances at any stage. These techniques allowed us to build DLA clusters with up to 10^9 particles for the two-dimensional case and up to 10^8 particles for the three-dimensional case. We also worked with the PDLA model, obtaining fractal clusters with up to 10^10 and 10^9 particles for two- and three-dimensional clusters respectively. We ran the PDLA simulations on a supercomputer and the DLA simulations on a high-performance server. We employed POSIX threads to provide parallelization, with mutexes as control mechanisms to achieve synchronization between groups of 4 processors, hence simulating PDLA clusters with early convergence to the DLA model.

Keywords: DLA, PDLA, distance queries, nearest neighbor, fractal cluster, quadtree, octree

Title of the Paper: Program Recursive Forms and Programming Automatization for Functional Languages


Authors: N. Archvadze, M. Pkhovelishvili, L. Shetsiruli, M. Nizharadze

Abstract: An automatic programming system is considered, by means of which it becomes easier to carry out the traditional programming stages. We discuss the recursive forms that exist for functional programming languages (parallel recursion, interrecursion, and higher-level recursion) together with induction methods for their verification. We show how to represent the simple and double loops of imperative languages by means of these recursive forms, and we study the possibility of verification for each recursive form.

Keywords: Functional Programming Languages, Recursive Forms, Programs Verification

Title of the Paper: About Flow Problems in Networks with Node Capacities


Authors: Laura Ciupala

Abstract: In this paper, we study different network flow problems in networks with node capacities. The literature on network flow problems is extensive, but these problems are usually described and studied in networks in which only arcs have finite capacities. There are several applications that arise in practice and that can be reduced to a specific network flow problem in a network in which nodes also have limited capacities. This is the reason for focusing on the maximum flow problem, the minimum flow problem, and the minimum cost flow problem in a network with node capacities.

Keywords: Network flow; Network algorithms; Maximum flow; Minimum flow; Minimum cost flow; Minimum cut
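
One standard way to handle node capacities, on which such reductions typically rely, is to split each capacitated node into an in-node and an out-node joined by an arc carrying the node's capacity; a minimal sketch (the naming convention is illustrative, and this is the textbook transformation rather than this paper's specific algorithms):

```python
def split_nodes(arcs, node_cap):
    """Reduce a network with node capacities to an ordinary
    arc-capacitated network: each capacitated node v becomes
    v_in -> v_out with capacity cap(v), and every original arc
    (u, v, c) becomes (u_out, v_in, c)."""
    new_arcs = []
    for v, cap in node_cap.items():
        new_arcs.append((f"{v}_in", f"{v}_out", cap))
    for u, v, c in arcs:
        u_out = f"{u}_out" if u in node_cap else u
        v_in = f"{v}_in" if v in node_cap else v
        new_arcs.append((u_out, v_in, c))
    return new_arcs

# A path s -> a -> t where node a can carry at most 3 units of flow.
print(split_nodes([("s", "a", 5), ("a", "t", 7)], {"a": 3}))
```

After the transformation, any ordinary maximum-flow, minimum-flow, or minimum-cost-flow algorithm applies unchanged.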

Title of the Paper: Achieving Intelligent Agents and its feasibility in Swarm-Array Computing?


Authors: Blesson Varghese, Gerard Mckee

Abstract: The work reported in this paper proposes ‘Intelligent Agents’, a swarm-array computing approach focused on applying autonomic computing concepts to parallel computing systems and building reliable systems for space applications. Swarm-array computing is a novel, swarm-robotics-inspired computing approach considered as a path to achieve autonomy in parallel computing systems. In the intelligent agent approach, a task to be executed on parallel computing cores is considered as a swarm of autonomous agents. A task is carried to a computing core by carrier agents and can be seamlessly transferred between cores in the event of a predicted failure, thereby achieving the self-* objectives of autonomic computing. The approach is validated on a multi-agent simulator.

Keywords: Autonomic computing, Swarm-array computing, Intelligent agents, Carrier agents

Title of the Paper: Developing Ontology Based Applications with O3L


Authors: Agostino Poggi

Abstract: Ontologies have been gaining interest and their use has been spreading in different application fields. However, their use in the realization of applications might be further increased by the availability of more usable and efficient software libraries for the management of ontologies. In this paper, an object-oriented software library for the management of OWL ontologies is presented. This software library, called O3L (Object-Oriented Ontology Library), provides a complete representation of ontologies compliant with the W3C OWL 2 specification. O3L is not intended for the creation and manipulation of ontologies; rather, it provides a simplified and efficient API for the realization of applications that interoperate through the use of shared ontologies, and allows: i) the use of OWL individuals as application data, ii) the exchange of OWL individuals between applications, iii) reasoning about OWL individuals, and iv) the classification of OWL classes and properties. This software library has been tested in the realization of some e-business applications, showing both ease of application development and high performance in execution.

Keywords: OWL, object-oriented model, ontology based applications, ontology reasoning, semantic Web, Java

Title of the Paper: The Application of the Finite Element Method in the Biomechanics of the Human Upper Limb and of some Prosthetic Components


Authors: Antoanela Naaji, Daniela Gherghel

Abstract: The inclusion of analytical and experimental models in biomechanical studies yields important data for research concerning the human skeleton, its traumas and diseases. The paper showcases a number of results regarding the static and dynamic analysis of some biomechanical components using the finite element method (FEM). Models representing parts of the human upper limb have been studied using static trials. Since we wish to emphasize the way in which such analyses can be done with the finite element method, we present only a few relevant examples for which we have experimental data, namely: analysis of the compression, bending, and stretching of the humerus, and bending of the radius and the ulna. Another study was carried out in order to determine the dynamic behavior of a model of a total arm prosthesis, using FEM. The models were created with the SolidWorks program and the trials with the HyperMesh program.

Keywords: FEA, static analysis, dynamic analysis, biomechanics

Title of the Paper: An Algorithm for the Guillotine Restrictions Verification in a Rectangular Covering Model


Authors: Daniela Marinescu, Alexandra Baicoianu

Abstract: We consider the cutting and covering problem with guillotine restrictions. In one of our previous works, we have shown the connection between the connected components of a graph representation of the covering model and the guillotine cuts. Based on this, in this paper we propose an algorithm which can be used to verify guillotine restrictions in a two-dimensional covering model.

Keywords: Two-dimensional covering problem, guillotine restrictions, connected components of a graph

Title of the Paper: Improved Algorithm for Minimum Flows in Bipartite Networks with Unit Capacities


Authors: Eleonor Ciurea, Adrian Deaconu

Abstract: The theory and applications of network flows is probably the most important single tool for applications of digraphs and perhaps even of graphs as a whole. In this paper we study a minimum flow algorithm in bipartite networks with unit capacities, combining an algorithm for minimum flow in bipartite networks with an algorithm for minimum flow in unit capacity networks. Finally, we present applications of the minimum flow problem in bipartite networks with unit capacities.

Keywords: Network flows, minimum flow problem, unit capacity networks, bipartite networks, maximum cut

Title of the Paper: Transaction-item Association Matrix-Based Frequent Pattern Network Mining Algorithm in Large-scale Transaction Database


Authors: Wei-Qing Sun, Cheng-Min Wang, Tie-Yan Zhang, Yan Zhang

Abstract: Increasing the efficiency of data mining is the current emphasis in this field. Through the establishment of a transaction-item association matrix, this paper turns the process of association rule mining into elementary matrix operations, which makes the process of data mining clear and simple. Compared with algorithms like Apriori, this method avoids the demerit of traversing the database repeatedly, and noticeably increases the efficiency of association rule mining through the use of sparse storage techniques for large-scale matrices. For incremental transaction matrices, it also makes the maintenance of association rules more convenient through the use of partitioned matrix calculation techniques. On the other hand, aiming at the demerits of the FP-growth algorithm, this paper proposes an FP-network model which compresses the data needed in association rule mining into an FP-network. Compared with the original FP-tree model, the proposed FP-network is undirected, which enlarges the scale of transaction storage; furthermore, the FP-network is stored through the definition of the transaction-item association matrix, so it is convenient to mine association rules on the basis of defined node capabilities. Experimental results show that the FP-network association rule mining algorithm proposed in this paper not only inherits the merits of the FP-growth algorithm, but also maintains and updates data conveniently. It improves the efficiency of association rule mining significantly.

Keywords: Association rule, association matrix, data mining, FP-growth algorithm, FP-network algorithm, frequent itemset
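
The basic matrix view of support counting can be sketched as follows (a dense illustration of the general idea only; the paper's sparse storage, partitioned calculation, and FP-network structures are not reproduced here):

```python
def build_matrix(transactions, items):
    """Transaction-item association matrix: rows are transactions,
    columns are items, entries are 0/1 membership flags."""
    return [[1 if it in t else 0 for it in items] for t in transactions]

def support(matrix, cols):
    """Support of an itemset = number of rows whose entries are 1 in
    every column of `cols` (an elementary matrix operation, so the
    database itself never has to be traversed again)."""
    return sum(1 for row in matrix if all(row[c] for c in cols))

items = ["bread", "milk", "eggs"]
m = build_matrix([{"bread", "milk"}, {"milk"}, {"bread", "milk", "eggs"}],
                 items)
print(support(m, [0, 1]))  # transactions containing both bread and milk -> 2
```

An incremental batch of transactions simply appends rows to the matrix, which is the convenience for rule maintenance that the abstract points out.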

Title of the Paper: A Program that Acquires how to Solve Problems in Mathematics


Authors: Machiko Fujiwara, Kenzo Iwama

Abstract: In mathematics, a sequence of sentences describes how to solve a problem; for instance, the sentences “Calculate the least common multiple of 10 and 15. Firstly divide 10 and 15 by 2. Results are ‘5’ and ‘cannot divide’. ‘5’ and ‘cannot divide’ are not 1 and 1. Divide 5 and 15 by 2. Results are ‘cannot divide’ and ‘cannot divide’.” and so on describe how to calculate the least common multiple of 10 and 15. When example sequences of sentences are given to our program, the program transforms the example sequences to generate a procedure, pG, that solves a problem. For instance, one generated procedure, pGp, checks whether a given number is a prime number, another procedure, pGc, calculates the least common multiple of any two numbers, and another procedure, pGf, adds two given fractions. This paper explains our program that generates procedures, one after another, each of which solves one mathematical problem. The paper also argues that, as a result of generating a procedure, pG, the meaning of a sentence (or sentences) is represented in the generated procedure, pG.

Keywords: Acquisition of how to solve problems, Generation of programs from examples, Meaning of a sentence
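
The worked example in the abstract (repeated division to find the least common multiple) corresponds to a procedure like the following sketch, written by hand here rather than generated by the authors' program:

```python
def lcm_by_division(a, b):
    """Compute the least common multiple by repeatedly dividing both
    numbers by successive divisors, mirroring the worked example in the
    abstract ('divide 10 and 15 by 2, ...'): a divisor that divides at
    least one number is applied and multiplied into the result."""
    lcm, d = 1, 2
    while a != 1 or b != 1:
        if a % d == 0 or b % d == 0:
            if a % d == 0:
                a //= d          # '5' in the example
            if b % d == 0:
                b //= d          # or 'cannot divide'
            lcm *= d
        else:
            d += 1               # neither divides: try the next divisor
    return lcm

print(lcm_by_division(10, 15))  # 30
```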

Title of the Paper: A Program that Acquires how to Execute Sentences


Authors: Machiko Fujiwara, Kenzo Iwama

Abstract: One writes example sequences of sentences so that one sequence solves an instance of a problem, and writes how each example runs on a computer. For instance, one writes a sentence “From 1 to 10, repeat Body”, and also writes how the sentence Body repeats its execution on the computer. Then one gives them to a program, pI, and lets it generalize how the example sequences run and generate a procedure, pG. When the program, pI, gets a new example sequence to solve a new instance of the problem, the program, pI, executes the procedure, pG. For instance, one writes a sentence “From 5 to 8, repeat Body”, and then the procedure, pG, repeats the sentence Body four times. As a result of generating a procedure, pG, the program, pI, implicitly acquires rules of a grammar that produces sentences. Since the generated procedures, pG's, describe how to execute sentences with conditional branches, varying numbers of repetitions, and varying depths of recursive calls, this paper argues that our program, pI, acquires a grammar of a language equivalent to that used in a conventional programming language.

Keywords: Generation of procedures from examples, Acquiring how to execute sentences from examples, Acquisition of a language grammar.

Title of the Paper: Adaptive Life-Cycle and Viability based Paramecium-Imitated Evolutionary Algorithm


Authors: Ming-Shen Jian, Ta Yuan Chou, Kun-Sian Sie, Long-Yeu Chung

Abstract: This paper proposes a sim-paramecium genetic algorithm to enhance the searching and optimizing speed of classical genetic algorithms. Based upon classical genetic algorithms, the sim-paramecium genetic algorithm employs additional operators, such as asexual reproduction, competition, and livability, in the survival operation. Taking advantage of these three operators, the searching and optimizing speed can be increased. Experiments indicate that simulations with the proposed algorithm show a 47% improvement in convergence speed on the traveling salesman problem. Also, when applying the proposed method to the graph coloring problem, the proposed algorithm shows a 10% improvement in solution quality. Furthermore, since these operators are additions to the original GA, the algorithm can be further improved by enhancing the basic operators, such as selection, crossover, and mutation.

Keywords: Genetic algorithm, paramecium, traveling salesman problem, graph coloring problem, evolutionary algorithm

Title of the Paper: Design & Development of Collaborative Workflow for Lean Production in a Repair & Overhaul Industry


Authors: Teh Ying Wah, Ng Hooi Peng, Ching Sue Hok, Lai Mei Yoon

Abstract: The maintenance, repair, and overhaul of aerospace fuel nozzles is a dynamic process, as the workflow is unpredictable. Further detailed checking and analysis of defects are needed in order to determine the workflow. Traditional paper-based work-in-progress tracking had been implemented to manage nozzle production. However, paper-based work-in-progress tracking is inefficient in dealing with these unpredictable and complex workflows. With a paper-based system there is a probability of entering or recording erroneous information. This wrong information causes problems in the production line, as the results are inconsistent. Furthermore, inaccurate results make it difficult to track the nozzles. Excessive time is needed and wasted in identifying and correcting the errors. Hence, an electronic work-in-progress system is required to overcome the shortcomings of paper-based tracking and to enhance the efficiency of the production line. This paper describes a configurable and customizable work-in-progress tracking system as a means to improve over the traditional paper-based system.

Keywords: Work Flow, Configurable, Tracking System

Title of the Paper: Some Considerations About Collaborative Systems Supporting Knowledge Management in Organizations


Authors: Mihaela I. Muntean, Diana Tarnaveanu

Abstract: In the present global economy, strongly influenced by IT (information technology) and the evolution of information systems, modern organizations try to face the challenges by adjusting their strategies and restructuring their activities, in order to align them with the requirements of the new economy. It is certain that an enterprise's performance will depend on its capacity to sustain collaborative work. The evolution of information systems in these collaborative environments has made it necessary to adopt, for maintaining virtual activities/processes, the latest technologies/systems capable of supporting integrated collaboration in business services. By this we mean collaborative systems of different types (conversational tools, multi-agent systems), along with various enterprise applications, integrated in portal-based IT platforms. It is obvious that all collaborative environments (workgroups, communities of practice, collaborative enterprises) are based on knowledge, and that there is a strong interdependence between collaboration and knowledge management (KM). Therefore, we focus on how collaborative systems are capable of sustaining knowledge management and on their impact on optimizing the KM life cycle. We explore some issues regarding collaborative systems and propose a portal-based IT solution that sustains the KM life cycle through a distributed architecture. All considerations have a strong research background, our portal-based proposal for sustaining knowledge management in organizations being the subject of some Romanian research projects that fit within European research initiatives.

Keywords: Collaboration, collaborative environments, knowledge management, distributed knowledge management, collaborative systems, portals, knowledge portals.

Title of the Paper: A Multiagent Grid Metascheduler


Authors: Jose Aguilar, Rodolfo Sumoza

Abstract: This work proposes a Metascheduler for GRID platforms based on the interaction protocols of multiagent systems. These protocols use the paradigm of economic models to define the coordination mechanisms in agent communities. Specifically, in this work we use auction and tender economic models. We propose an adaptive Metascheduler for GRID platforms using these ideas. According to the number of available resources, one of these models is used to coordinate the assignment of resources.

Keywords: GRID, Metascheduler, Multiagents Systems, Auction, Tender, Assignment problem

Title of the Paper: A Security Incidents Management for a CERT based on Swarm Intelligence


Authors: Jose Aguilar, Blanca Abraham, Gilberto Moreno

Abstract: This paper proposes an incident management system based on swarm intelligence, in order to keep the information about computational security up to date. Particularly, swarm intelligence is an extension of multiagent theory, where reactive agents follow very simple rules. We propose a search and selection method based on swarm intelligence, where the agents search for previous answers to security incidents on the Internet.

Keywords: Incident Management System, Swarm Intelligence, Security Systems, CERT, Search Algorithm, Multiagent Systems

Title of the Paper: Case-Based Reasoning and Fuzzy Logic in Fault Diagnosis


Authors: Viriato Marques, J. Torres Farinha, Antonio Brito

Abstract: This paper is divided into four parts: the first one introduces SADEX, a fuzzy Case-Based Reasoning (CBR) system for fault diagnosis. The second focuses on its observation relevance factors and shows how the results are in complete agreement with the relevance concept introduced by Robertson and Sparck Jones in their well-known and proven technique for document retrieval. The third describes how equipment composition information can be used to generalize and adapt case solutions to new and unknown occurrences; this generalization is based on a taxonomic similarity between functionally autonomous modules (FAMs). Finally, the MKM (Maintenance Knowledge Manager) system is introduced.

Keywords: Case-based Reasoning, Fuzzy Systems, Relevance, Taxonomies, Knowledge Management

Issue 9, Volume 8, September 2009

Title of the Paper: Crossing Genetic and Swarm Intelligence Algorithms to Generate Logic Circuits


Authors: Cecilia Reis, J. A. Tenreiro Machado

Abstract: Genetic Algorithms (GAs) are adaptive heuristic search algorithms based on the evolutionary ideas of natural selection and genetics. The basic concept of GAs is to simulate the processes in natural systems necessary for evolution, specifically those that follow the principle of survival of the fittest first laid down by Charles Darwin. On the other hand, Particle Swarm Optimization (PSO) is a population-based stochastic optimization technique inspired by the social behavior of bird flocking or fish schooling. PSO shares many similarities with evolutionary computation techniques such as GAs. The system is initialized with a population of random solutions and searches for optima by updating generations. However, unlike GAs, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. PSO is attractive because there are few parameters to adjust. This paper presents a hybridization of a GA and a PSO algorithm (crossing the two algorithms). The resulting algorithm is applied to the synthesis of combinational logic circuits. With this combination it is possible to take advantage of the best features of each particular algorithm.

Keywords: Artificial Intelligence, Computational Intelligence, Evolutionary Computation, Genetic Algorithms, Particle Swarm Optimization, Digital Circuits

Title of the Paper: Prediction of Disulfide Bonding Pattern Based on Support Vector Machine with Parameters Tuned by Multiple Trajectory Search


Authors: Hsuan-Hung Lin, Lin-Yu Tseng

Abstract: Predicting the location of disulfide bridges helps solve the protein folding problem. Most previous work on disulfide connectivity pattern prediction uses prior knowledge of the bonding state of cysteines. In this study, an effective method is proposed to predict the disulfide connectivity pattern without prior knowledge of the cysteines' bonding state. To the best of our knowledge, without prior knowledge of the bonding state of cysteines, the best accuracy rates reported in the literature for the prediction of the overall disulfide connectivity pattern (Qp) and of individual disulfide bridges (Qc) are 48% and 51%, respectively, for the dataset SPX. In this study, the cysteine position difference, the cysteine index difference, the predicted secondary structure of the protein and the PSSM score are used as features. A support vector machine (SVM) is trained to compute the connectivity probabilities of cysteine pairs. An evolutionary algorithm called the multiple trajectory search (MTS) is integrated with the SVM training to tune the parameters of the SVM and the window sizes for the predicted secondary structure and the PSSM. The maximum weight perfect matching algorithm is then used to find the disulfide connectivity pattern. Testing our method on the same dataset SPX, the accuracy rates are 54.5% and 60% for disulfide connectivity pattern prediction and disulfide bridge prediction when the bonding state of cysteines is not known in advance.

Keywords: Disulfide bonding pattern, SVM, multiple trajectory search

Title of the Paper: Adapting the Ticket Request System to the Needs of CSIRT Teams


Authors: Pavel Kacha

Abstract: CSIRTs (Computer Security Incident Response Teams) are the natural response to widespread Internet threats. Many of them have grown out of small but focused groups of people, by streamlining and expanding what they were already doing as part of their IT administrative work. Formalisation of procedures and workflows brings the need for specialised tools that help with incident categorisation, authorisation of incident origin and general workflow. Also, the special nature of incoming report emails introduces new issues to otherwise well-known spam and backscatter fighting methods. Besides low-level know-how, an important part of security team practice is higher-level statistical analysis for pinpointing potential threats and trends. This paper proposes approaches to these problems and describes their implementation as modifications of, and supportive applications for, the Open Ticket Request System (OTRS), as well as experience from its use in a real-world medium-sized security team.

Keywords: OTRS, CSIRT, CERT, security incident, metadata, issue management, Bayesian analysis, antispam, backscatter, statistics

Title of the Paper: Audit System at CESNET-CERTS


Authors: Pavel Vachek

Abstract: CESNET-CERTS, the Computer Security Incident Response Team of the CESNET Association in Prague, Czech Republic, uses several security tools based on freely licensed programs. One of them is an enhanced system for host security auditing which is based on Nessus and runs on a PC server under Linux. An e-mail interface developed in-house allows users to perform basic host security audits simply and securely without having to study the extensive Nessus manuals and/or installing the Nessus server. The use of a similar open-source program OpenVAS within the Audit System is also considered.

Keywords: CESNET Association, Host security audit, Nessus, OpenVAS, PC, Linux, E-mail interface

Title of the Paper: A New Method for Clustering Heterogeneous Data: Clustering by Compression


Authors: Dorin Carstoiu, Alexandra Cernian, Valentin Sgarciu, Adriana Olteanu

Abstract: Nowadays, we have to deal with a large quantity of unstructured data produced by a number of sources. For example, clustering web pages is essential to getting structured information in response to user queries. In this paper, we test the results of a new clustering technique, clustering by compression, when applied to heterogeneous sets of data. The clustering by compression procedure is based on a parameter-free, universal similarity distance, the normalized compression distance or NCD, computed from the lengths of compressed data files (singly and in pair-wise concatenation). Compression algorithms allow defining a similarity measure based on the degree of common information, whereas clustering methods allow grouping similar data without any previous knowledge.
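The NCD mentioned in the abstract has a standard definition, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(.) is the compressed length of its argument. A minimal sketch using zlib as the compressor (the abstract does not specify which compressor the authors used, so zlib is an illustrative assumption):

```python
import zlib

def clen(data: bytes) -> int:
    """Compressed length of data, using zlib at maximum compression."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Values near 0 indicate much shared information; values near 1, almost none."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two near-identical texts compress well together, so their NCD is smaller
# than the NCD between a text and unrelated data.
doc_a = b"the quick brown fox jumps over the lazy dog " * 20
doc_b = b"the quick brown fox jumps over the lazy cat " * 20
noise = bytes((i * 97 + 13) % 256 for i in range(1024))
```

Clustering then proceeds on the pairwise NCD matrix with any standard clustering method; because a real compressor only approximates the ideal (uncomputable) distance, NCD values can slightly exceed 1.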

Keywords: Clustering, heterogeneous data, clustering by compression, Normalized Compression Distance (NCD), FScore

Title of the Paper: Effectiveness and Accuracy of Wireless Positioning Systems


Authors: Sebastian Fuicu, Marius Marcu, Bogdan Stratulat, Anania Girban

Abstract: Localization, or positioning, is an important aspect of mobile applications in order to achieve context-aware applications. The main goal of the localization process is to estimate the position of a mobile device in its environment based on a set of sensors with known positions. Modern mobile devices, including almost all smartphones and PDAs, have one or more wireless communication interfaces in order to communicate with other devices. WLAN is a widely accepted and implemented communication standard in many indoor environments; therefore many buildings are already equipped with IEEE 802.11 WLAN access points. The wireless adapters that modern mobile devices are equipped with can monitor the radio signal strength of nearby emitting sources. Based on the received radio signal strength, an estimate of the distance between the device and the power source can be computed. In this paper we describe and summarize our work on designing a simple and energy-efficient positioning solution to be implemented on mobile devices with limited resources.
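A hedged sketch of the general technique the abstract describes: converting received signal strength to a distance using the common log-distance path-loss model, then trilaterating from three access points with known positions. The model parameters (p0, n, d0) are illustrative assumptions, not values from the paper:

```python
def rssi_to_distance(rssi, p0=-40.0, n=2.0, d0=1.0):
    """Log-distance path-loss model (illustrative parameters):
    rssi = p0 - 10*n*log10(d/d0), hence d = d0 * 10**((p0 - rssi) / (10*n)).
    p0 is the RSSI at the reference distance d0; n is the path-loss exponent."""
    return d0 * 10 ** ((p0 - rssi) / (10 * n))

def trilaterate(anchors, dists):
    """2-D position from three anchors (x, y) and their estimated ranges,
    by subtracting the circle equations pairwise (which linearizes them)
    and solving the resulting 2x2 linear system with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Exact ranges from access points at (0,0), (10,0), (0,10)
# to the point (3,4) recover that point.
x, y = trilaterate([(0, 0), (10, 0), (0, 10)],
                   [5.0, 65 ** 0.5, 45 ** 0.5])
```

In practice RSSI is noisy, so a real system would smooth the readings and use more than three anchors with a least-squares fit; the sketch shows only the minimal geometric step.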

Keywords: Wireless positioning systems, radio signal strength, energy efficiency, trilateration

Title of the Paper: Application of ε-Testers Algorithms under Sketch and Streaming Calculation Model in Robot Navigation


Authors: Carlos Rodriguez Lucatero

Abstract: The goal of this article is the application of efficient approximate testing methods to the robot tracking problem as well as to the map building problem. In the databases field, one wishes to classify uncertain data and to answer queries approximately when data sources are incoherent. Our interest is based on the fundamental problems in randomized approximate testing algorithms. Concerning robot navigation, we would like to apply efficient data sketching algorithms for testing the automata inferred in robot motion tracking problems, as well as on maps sensorially generated by robot exploration and the map building process. In the case of the information generated by sensors, given that we have no complete access to the information, and given the inherent limitations on memory space, we apply streaming algorithms to efficiently approximate the solution of computational problems that arise in robot navigation.

Keywords: Robot navigation, Motion tracking, Exploration, Sketching and streaming algorithms, Testability

Title of the Paper: Three Different Designs for Packet Classification


Authors: Hatam Abdoli

Abstract: If we analyze real-life filter sets (classifiers) and packet classification requirements, it appears that the distribution of rule scope is non-uniform and denser in some subspaces of the total classifier space. These features guided us to add a "cut point heuristic" to HiCuts, one of the most efficient algorithms, resulting in two new optimized designs for HiCuts, named B-HiCuts and Hist. One of the fast, hardware-based solutions for classification is the use of TCAM memory to implement classifiers, but TCAMs are expensive and are not efficient for range fields. The third approach in this paper introduces a hybrid scheme that performs part of the search on prefix fields in TCAM; to check the other fields, the search process then traverses to a RAM section at the second level. Synthetic classifiers generated by ClassBench are used to simulate and evaluate the performance of the proposed designs. The main features of the proposed methods are the balancing of decision trees and the reduction of memory consumption for B-HiCuts and Hist, as well as solving the range-to-prefix conversion and multi-match classification problems in the last proposed design.

Keywords: Balanced tree, HiCuts, Heuristic, Packet classification, Packet filter, Router, TCAM

Title of the Paper: Empirical Determination of Sample Sizes for Multi-layer Perceptrons by Simple RBF Networks


Authors: Hyontai Sug

Abstract: It is well known that the time needed to train multilayer perceptrons is very long, because of the large weight space of the neural networks and the small weight adjustments required for convergence. The matter becomes worse when the training data set is large, which is common in data mining tasks. Moreover, the performance of neural networks changes depending on the samples. So, in order to determine appropriate sample sizes for multilayer perceptrons, this paper suggests an effective approach that uses simple radial basis function networks as a guide. Experiments with two different data sets that represent business and scientific domains well showed the effectiveness of the suggested method.

Keywords: Multilayer perceptron, sample size, radial basis function network, data mining

Title of the Paper: Sensitivity Analysis of Hopfield Neural Network in Classifying Natural RGB Color Space


Authors: Rachid Sammouda

Abstract: This paper presents a sensitivity analysis of the artificial Hopfield Neural Network (HNN) when segmenting natural color images. The color distinction or vision system relies on a two-step process which first classifies the different regions in the scene into a given number of clusters, and then assigns to each cluster a color that is close to that of its corresponding region in the raw image. The classification is performed by minimizing an energy function, typically the Sum of Squared Errors (SSE). The optimization process is found to be sensitive to the step size taken by the network on its way to the global minimum. The color assignment to the clusters is performed based on a combination of information from the color palette used in the raw image and the final distribution of the pixels among clusters. Applying the system to a gold-standard color image, the results show that HNN natural color segmentation accuracy can be significantly improved if the step size used when modifying the weights between neurons at each iteration is controlled. The color matching process shows a lot of consistency when tested with natural color images, as shown in the results presented here.

Keywords: Hopfield Neural Network, Sensitivity analysis, Segmentation, Natural Color Image matching, RGB Color Space

Title of the Paper: An Advanced Hybrid Machine Learning Approach for Assessment of the Change of Gait Symmetry


Authors: Jianning Wu

Abstract: The quantitative assessment of the change of gait symmetry plays a very important role in clinical diagnostics. This paper investigated the application of an advanced hybrid machine learning approach, combining kernel principal component analysis (KPCA) with a support vector machine (SVM), to evaluate the change of gait symmetry quantitatively, based on the idea that discriminating the functional change between human lower extremities can be posed as a binary classification task. To assess the change of gait symmetry accurately, more nonlinear principal components extracted by KPCA were used to build the training set of the SVM, which could enhance the generalization performance of the SVM. The foot-ground force gait data of 24 elderly participants were acquired using a strain gauge force platform during normal walking, and were analyzed with our proposed model. The test results demonstrated that, compared to SVM-based classification models, our proposed technique achieves superior classification performance and can accurately discriminate the difference between the right- and left-side gait function of the lower limbs, and that the principal components extracted by KPCA with a polynomial kernel (degree 3) capture more useful information about the intrinsic nonlinear dynamics of human gait than the selected key gait variables. The proposed hybrid model could function as an effective tool for clinical diagnostics in future clinical settings.

Keywords: Gait analysis, Gait symmetry, Kernel principal component analysis, Support vector machine, Gait classification, Kinetic gait data

Title of the Paper: Model Validation for GPS Total Electron Content (TEC) using 10th Polynomial Function Technique at an Equatorial Region


Authors: Norsuzila Ya'Acob, Mardina Abdullah, Mahamod Ismail, Azami Zaharim

Abstract: GPS receivers have been profitably employed by researchers for investigations in ionospheric and atmospheric science. However, a number of improvements in measurement accuracy are necessary for today's applications. The ionosphere has practical importance in GPS applications because it influences transionospheric radio wave propagation. Total Electron Content (TEC) is one of the ionospheric parameters that produces the strongest effects in many radio applications such as radio communications, navigation and space weather. Delays in GPS signals affect the accuracy of GPS positioning, so the determination of the TEC will aid reliable space-based navigation systems. By modelling this TEC parameter, the ionospheric error can be evaluated and corrected for differential GPS. This paper describes the determination of the differential ionospheric error to sub-centimetre accuracy using the developed model. An ionospheric delay model was developed to accurately determine the difference in ionospheric delay expected over a short baseline, so that a more accurate differential GPS correction could be made. An ionospheric error correction model should be applicable to any location, including the equatorial region. The results showed that the developed algorithm is a function of the elevation angle and the TEC along the path from the reference station to the satellite, and could give the differential ionospheric delay with sub-centimetre accuracy.

Keywords: GPS, TEC, ionosphere, baseline, differential GPS, transionospheric

Title of the Paper: Filtering vs. Nonlinear Estimation Procedures for Image Enhancement


Authors: Barbara Dzaja, Mirjana Bonkovic, Spomenka Bovan

Abstract: Image enhancement methods can be divided into two groups: those that use only a single image and those that rely on a specific training set or use multiple images. In this paper an iterative algorithm, based on quasi-Newton methods, is introduced with the objective of enhancing resolution using only a single image. The paper compares the results obtained with two approaches: the nonlinear iterative algorithm versus a filtering algorithm based on Empirical Mode Decomposition (EMD), which can be treated as a filtering procedure for image enhancement.

Keywords: Image enhancement, Broyden algorithm, Iterative algorithms, Empirical Mode Decomposition

Title of the Paper: Inverse Dynamic Compound Control for Intelligent Artificial Leg Based on PD-CMAC


Authors: Hong-Liu Yu, Zhao-Hong Xu, Xing-San Qian, Zhan Zhao, Ling Shen

Abstract: A traditional mathematical model is not suitable for actual control, because the knee torque of an intelligent artificial leg (IAL) is indirectly caused by nonlinear damping. An inverse dynamic compound control for an intelligent artificial leg was studied. A dynamics model of a hydraulic IAL with nonlinear damper control parameters and hip torque was set up, and an inverse dynamic compound controller based on PD-CMAC for tracking the knee swing was designed. The simulation results show that an arbitrary trajectory, such as a desired walking pattern, can be tracked in less than 0.5 seconds, which proves that the controller meets the real-time and precision demands.

Keywords: Intelligent artificial leg, compound control, dynamic model, cerebellar model articulation controller

Title of the Paper: Numerical Simulation of Bridge Damage under Blast Loads


Authors: Rong-Bing Deng, Xian-Long Jin

Abstract: The numerical simulation of the structural damage of a steel truss bridge subjected to blast loading, with the aid of a hydrocode, is presented in this paper. A three-dimensional nonlinear finite element model of an actual bridge has been developed based on the design drawings of the Minpu II Bridge in Shanghai. The effects of mesh size on the pressure distribution produced by explosions are also studied. Through comparison between the calculation results and the experimental values, the reliability of the calculation is validated. The entire process from the detonation of the explosive charge to deck cracking, including the propagation of the blast wave and its interaction with the structure, is reproduced. The numerical results show the damage of bridge parts and provide a global understanding of the bridge under blast loads. They may supplement experimental studies for developing appropriate blast-resistant design guidelines for bridges in the future.

Keywords: Steel truss bridge, blast load, hydrocode, bridge damage, ALE method, numerical simulation

Title of the Paper: Texture Defect Detection System with Image Deflection Compensation


Authors: Chun-Cheng Lin, Cheng-Yu Yeh

Abstract: Image textural analysis technology has been widely used in the design of automated defect detection systems. Because the presence of defects may change the textural features of an image, a reference image without defects can be compared with the test image to detect whether there are any defects. However, besides defects, the deflection of the input test image could also change its textural features. When there is any angular difference between the reference and test images, their textural features would also be different, even if there is no defect in the test image. As a result, misjudgment of the defect detection system may occur. Most of the previous studies have focused on the development of textural analysis technology which could decrease the effect of test image deflection. This study aimed to estimate the deflection angle of test images through polar Fourier transform and phase correlation analysis, and rotate the reference image by the same angle to compensate for the deflection of the test image. After the angles of the reference and test images were brought into line, the textural analysis based on the gray level co-occurrence matrix was applied to analyze and compare the textural features of the two images. The results of actual texture defect detection demonstrated that the angular differences between the reference and test images could be estimated correctly, with an estimation error of only 0° to 0.5°. By compensating for the deflection of the test image, the accuracy of the texture defect detection could be effectively enhanced.

Keywords: Texture defect detection, Image deflection compensation, Polar Fourier transform, Phase correlation analysis, Gray level co-occurrence matrix

Title of the Paper: Fault-Tolerant Mapping of a Mesh Network in a Flexible Hypercube


Authors: Jen-Chih Lin

Abstract: The Flexible Hypercube FHN is an important variant of the Hypercube Hn and possesses many desirable properties for interconnection networks. This paper proposes a novel fault-tolerant algorithm for mapping a mesh network into a Flexible Hypercube. The main results obtained are: (1) a searching path of an FHN includes approximately (n+1) nodes, where n=2logN; therefore, O() faults can be tolerated; (2) our mapping methods are optimized mainly for balancing the processor and communication link loads. These results mean that parallel algorithms developed for the mesh network structure can be executed on a faulty FHN. The useful properties revealed and the algorithm proposed in this paper can help system designers evaluate a candidate network's competence and suitability, balancing regularity against other performance criteria, when choosing an interconnection network.

Keywords: Hypercube, Flexible Hypercube, Mesh Network, Fault-Tolerant, embedding

Title of the Paper: The Function Block Model in Embedded Control and Automation from IEC61131 to IEC61499


Authors: Kleanthis Thramboulidis

Abstract: The Function Block (FB) model was first standardized by the 1131 standard of the International Electrotechnical Commission (IEC) for programmable controllers. This standard was successfully adopted by industry, but it has several constraints for the development of today's complex embedded control and automation systems. These constraints are mainly imposed by the procedural programming paradigm and the device-centric approach adopted by the standard. To address these constraints, the IEC proposed the 61499 standard, which is an attempt to exploit object orientation and the application-centric paradigm in the control and automation domain. In this paper, the FB models of 1131 and 61499 are briefly described, and several unclear issues related to the programming paradigms adopted, interoperability, composability and execution semantics of these FB models are clarified. The paper focuses on the execution semantics of the 61499 standard, since this is one of the most important reasons that industry has not yet accepted this standard.

Keywords: Embedded control and automation systems, IEC 61131, IEC 61499, 1131 Function Block Model, IEC61499 execution environment, execution model semantics, Factory Automation

Title of the Paper: An Environment for Describing Software Systems


Authors: Adel Smeda, Adel Alti, Abbdellah Boukerram

Abstract: Describing the architecture of complex software systems needs comprehensive models and complete tools. The description of software systems can be achieved by using an architecture description language (ADL) or an object-oriented modeling language. In this article, we show how to build a hybrid model, based on the two approaches, to describe the architecture of software systems. First we define a metamodel for software architecture; next, based on this metamodel, we implement an environment for describing the architecture of software systems.

Keywords: Software Architecture, COSA, Architecture Description Languages, UML 2.0 Modeling Language, Component, Connector

Issue 10, Volume 8, October 2009

Title of the Paper: Opportunities in ICT Education


Authors: Seyed Shahrestani

Abstract: While cost saving is at the forefront of the reasons for offshoring to low-wage countries, the moves relevant to ICT are also motivated by the difficulty of finding the right talent inside the country. One of the root causes of this difficulty is the drastic fall in the number of students in fields like computer engineering and ICT. To combat this, there have been serious changes in national education policies, and in the way universities and other training institutions conduct their business, to inspire young students to choose ICT for their studies. Although, as a consequence, in some parts of the world the number of students enrolling in these fields has stabilized or even increased, given the number of years it takes to educate a graduate, the number of graduates has been dropping at alarming rates. Furthermore, the ICT skills shortages for experienced professionals in most industrialized countries can be expected to get worse before they eventually get better. There is also a strong case for retraining many people who already have tertiary education, whether in the workforce or not, to overcome the ominous ICT skills dilemma. This paper reports on the examination of these problems. It also reports on the advantages of taking a broader view, requiring a combination of many existing solutions along with novel approaches and a realistic analysis of the acceptance of the current global ICT services and education environments to overcome these problems.

Keywords: ICT Education, Higher Education, Offshoring, Skills Shortage

Title of the Paper: A Closer Look to the V-Model Approach for Role Engineering


Authors: Radu Constantinescu, Andrei Toma

Abstract: Role engineering is both a necessary and a critical topic in the development of Role-Based Access Control (RBAC) systems, which seem to be the most proficient access control approach nowadays. Even though the RBAC model has already reached maturity, the role engineering process is not a standardized approach. This paper aims to illustrate an enhanced process model for role engineering. The model is focused on the intuitive discovery of roles and their assignment to permissions, using a test-driven approach.

Keywords: Access control systems, role based access control systems – RBAC, role engineering, authorization, roles, permissions, constraints, role hierarchies

Title of the Paper: Improving Organizational Efficiency and Effectiveness in a Romanian Higher Education Institution


Authors: Scorta Iuliana, Bara Adela, Constantinescu Radu, Zota Razvan, Nastase Floarea

Abstract: In response to the significant changes in the competitive climate of public sector higher education institutions (HEIs), the Romanian universities face the need to improve operational efficiency by implementing advanced information systems. The purpose of this paper is to provide a solution for an ERP (Enterprise Resource Planning) system that could integrate all of a Romanian HEI’s department functionalities, to propose a model that describes how the concepts of user competence and role may optimize an ERP system implementation in a Romanian HEI and to provide a possible architectural solution for developing a decision support system that can extract and report the information from the proposed ERP system.

Keywords: ERP, HEI, DSS, Competencies, Roles, Knowledge

Title of the Paper: New Families of Computation-Efficient Parallel Prefix Algorithms


Authors: Yen-Chun Lin, Li-Ling Hung

Abstract: New families of computation-efficient parallel prefix algorithms for message-passing multicomputers are presented. The first family improves the communication time of a previous family of parallel prefix algorithms; both use only half-duplex communications. Two other families adopt collective communication operations to reduce the communication times of the former two, respectively. The precondition of the presented algorithms is also given. These families each provide the flexibility of either fewer computation time steps or fewer communication time steps to achieve the minimal running time depending on the ratio of the time required by a communication step to the time required by a computation step. Relative merits and drawbacks of parallel prefix algorithms are described and illustrated to provide insights into when and why the presented algorithms can be best used.
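For context, the prefix (scan) computation the abstract refers to can be illustrated by the classic recursive-doubling scheme, in which round j combines each element with the one 2^j positions to its left; on a message-passing machine each round corresponds to one communication step, giving ceil(log2 n) steps in total. This sequential Python sketch only shows the data flow and is not one of the paper's algorithms:

```python
def prefix_scan(values, op=lambda a, b: a + b):
    """Inclusive prefix computation by recursive doubling: after round j,
    element i holds op applied over elements max(0, i - 2**j + 1) .. i.
    op must be associative (e.g. +, max, min)."""
    x = list(values)
    n, step = len(x), 1
    while step < n:
        # Build the next round from the previous one; elements with no
        # partner 'step' positions to the left are carried over unchanged.
        x = [x[i] if i < step else op(x[i - step], x[i]) for i in range(n)]
        step *= 2
    return x
```

For example, `prefix_scan([1, 2, 3, 4])` produces the running sums `[1, 3, 6, 10]`. This scheme takes the minimal number of rounds but performs O(n log n) operations; the computation-efficient families in the paper trade off such computation steps against communication steps.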

Keywords: Collective communication, Computation-efficient parallel prefix, Half-duplex communication, Message-passing multicomputer, Parallel algorithm, Precondition, Prefix computation

Title of the Paper: New Computation of Normal Vector and Curvature


Authors: Hyoung-Seok Kim, Ho-Sook Kim

Abstract: Local geometric properties such as curvatures and normal vectors play important roles in analyzing the local shape of objects. The result of geometric operations such as mesh simplification and mesh smoothing depends on how the normal vectors and curvatures of vertices are computed, because there are no exact definitions of the normal vector and the discrete curvature on meshes. Therefore, discrete curvature and normal vector estimation play fundamental roles in the fields of computer graphics and computer vision. In this paper, we propose new methods for computing the normal vector and curvature that are more intuitive than previous methods. Our normal vector computation algorithm computes normal vectors more accurately and is applicable to meshes of arbitrary topology, thanks to the properties of local conformal mapping and the mean value coordinates. Secondly, we point out a fatal error in previous discrete curvature estimations, and then propose a new discrete sectional-curvature estimation, based on parabola interpolation and the geometric properties of Bezier curves, that overcomes this error. Experiments confirm that the normal vectors and curvatures generated by our algorithm are more accurate than those of previous methods.

Keywords: Normal vector, curvature, local geometric property, mesh segmentation

Title of the Paper: Auction Resource Allocation Mechanisms in Grids of Heterogeneous Computers


Authors: Timothy M. Lynar, Ric D. Herbert, Simon

Abstract: This paper examines economic resource allocation through a number of auction types for a grid of e-waste computers. It examines the time to complete tasks and the energy used in completing the tasks on a grid. A model of a simulated grid is developed and used to evaluate the resource allocation mechanisms. The model is an agent-based simulation whereby user agents submit tasks to node agents that process these tasks. We evaluate three types of resource-allocator agents, which all use a type of auction. The auction types are a batch auction, a continuous double auction and a pre-processed batch auction. The pre-processed batch auction is designed to combine the advantages of the continuous double auction and the batch auction. The simulated grid is calibrated to a real e-waste grid where each node has a performance index. This is a test grid of eight nodes of heterogeneous computer hardware with differing computational ability and energy usage. We simulate the auction types under the same task input streams. We consider the effects of a task impulse-response stream and an input-stream step response on energy usage and on the time to complete all tasks. Finally, we consider the three auction allocation mechanisms under a random task stream. The paper finds that the choice of auction method makes a substantial difference in the time to complete tasks and in total energy consumption.

Keywords: Grid computing, resource allocation, auctions, e-waste, energy consumption

Title of the Paper: Improving the ETL Process and Maintenance of Higher Education Information System Data Warehouse


Authors: Igor Mekterovic, Ljiljana Brkic, Mirta Baranovic

Abstract: HEIS (Higher Education Information System) is a project funded by the Croatian Ministry of Science, Education and Sports, started in 2001. HEIS is a comprehensive information system that supports education-related processes taking place within a higher education institution. As part of the project, a data warehouse was developed to provide reporting and analytical features. This paper presents the HEIS data warehouse architecture, comments on the data model and addresses issues (and our solutions to them) that arose during the seven-year development and maintenance period. Foremost, we address improvements in the ETL and maintenance processes.

Keywords: Data warehouse, Higher education information system, ETL, Maintenance, Dimensional model

Title of the Paper: Traceability-based Incremental Model Synchronization


Authors: Istvan Madari, Laszlo Angyal, Laszlo Lengyel

Abstract: Model transformation is a crucial aspect of Model-Driven Software Development. With the help of model transformation, we can generate source code or other artifacts from software models. However, a recurring problem in software development is the fact that source and target models coexist and evolve independently. In general, a modeled system is composed of several models that are often related to one another. Consequently, the related models will no longer be consistent if one of them is altered in the development process. For that reason, a model synchronization method is necessitated to resolve inconsistency between the modified models. Performing synchronization manually can be an error-prone task due to the number and complexity of model elements. In model-driven technologies, where processing is carried out as a series of model transformations, applying model transformations can also be a reasonable option for the reconciliation. This paper presents an approach that uses trace models and model transformations to facilitate incremental model synchronization.

Keywords: Model transformation, Model synchronization, Traceability, Trace model

Title of the Paper: Semantic Processing based on Eye-Tracking Metrics


Authors: Robert Andrei Buchmann, Alin Mihaila, Radu Meza

Abstract: This paper proposes a framework for capturing semantics from eye-tracking data during the process of text skimming/scanning by readers of electronic documents and HTML user interfaces. RDFa and HTML microformats are some of the easier ways proposed by the Semantic Web paradigm for embedding semantics in web pages. XSLT transformations or specialized parsers may easily convert such documents to RDF/XML semantic repositories. However, semantics do not usually have an absolute character. Although a variety of Web 2.0 oriented ontologies and microformats have been widely adopted or even standardized (Dublin Core, FOAF, XFN etc.) in order to achieve semantic interoperability, there are scenarios in which user-relative semantics are especially important, such as the development of customization engines (web session customization, recommender systems, targeted advertising), where a certain user must only share semantics with himself or with similar persons. Given the same web document, different readers will attach different semantics and relevance to the ideas, concepts or structural blocks of the document. Eye tracking is an emerging field with multiple applications in medicine, marketing, cognitive sciences and elsewhere, which allows the extraction of data regarding the eye activity of a user during human-computer interaction. Eye-tracking data is valuable for measuring reading patterns and user experience, and reveals the specific parts of an image or document that attract the user's interest.

Keywords: Eye tracking, semantics, text skimming, microformats

Title of the Paper: Improved Performance Model for Web-Based Software Systems


Authors: Agnes Bogardi-Meszoly, Tihamer Levendovszky, Hassan Charaf

Abstract: The performance of web-based software systems is one of the most important and complicated considerations. With the help of a proper performance model and an appropriate evaluation algorithm, performance metrics can be predicted. The goal of our work is to introduce and verify an improved multi-tier queueing network model for web-based software systems. In our work, an evaluation and prediction technique applying dominant factors with respect to the response time and throughput performance metrics has been established and investigated. The Mean-Value Analysis algorithm has been improved to model the behavior of the thread pool. The proposed algorithm can be used for performance prediction. The convergence and limit of the algorithm have been analyzed. The validity of the proposed algorithm and the correctness of the performance prediction have been proven with performance measurements.

Keywords: Web-based software systems, Queueing model, Mean-Value Analysis, Thread pool, Performance prediction, Convergence
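The Mean-Value Analysis algorithm that the paper extends is a standard exact procedure for closed queueing networks. A minimal sketch of the textbook version (without the authors' thread-pool extension, which is the paper's own contribution) might look like this:

```python
def mva(service_demands, n_users, think_time=0.0):
    """Exact Mean-Value Analysis for a closed queueing network of
    single-server queueing stations (n_users >= 1).
    service_demands[k]: total service demand D_k at station k (seconds)."""
    K = len(service_demands)
    Q = [0.0] * K                          # mean queue length per station
    for n in range(1, n_users + 1):
        # residence time at station k when n customers circulate
        R = [service_demands[k] * (1.0 + Q[k]) for k in range(K)]
        X = n / (think_time + sum(R))      # throughput via Little's law
        Q = [X * R[k] for k in range(K)]   # queue lengths for next level
    return X, sum(R)                       # throughput, total response time

# e.g. two tiers with demands 0.5 s and 0.2 s, 10 concurrent users
X, R_total = mva([0.5, 0.2], 10)
print(f"throughput={X:.3f}/s, response={R_total:.3f}s")
```

At high user counts the throughput saturates at 1/max(D_k), which is the bottleneck law the prediction relies on.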

Issue 11, Volume 8, November 2009

Title of the Paper: A Flexible Implementation of a Web-based Election System for Educational Organizations


Authors: Sharil Tumin, Sylvia Encheva

Abstract: Web-based on-line voting and on-line election systems provide benefits of usability, manageability and security. A particular workflow in any phase of an on-line election process can be modeled and implemented securely by employing basic security applications readily provided by well-established cryptographic technologies. By analyzing the data flow between different phases in the workflow, secure processes can be implemented using Web, database and cryptographic techniques. The implementation has to deliver a system that provides the properties mandatory for an on-line election system: authentication, democracy, anonymity, non-coercion, accuracy, reliability, veracity, verifiability, neutrality, and linkability.

Keywords: e-Voting, e-Election, Applications security, Multi-tiers Web-based application, Secure Workflow Modeling, Practical Cryptographic Applications

Title of the Paper: A Utility based, Multi-Attribute Negotiation Approach for Semantic Web Services


Authors: Sandeep Kumar, Nikos E. Mastorakis

Abstract: Apart from other important Semantic Web service related processes such as discovery, selection and composition, the process of negotiation is also generally required in semantic-web-based systems. Before using the services of a service provider, the service requester may need to negotiate with it on various issues. This paper presents a utility-based, multi-attribute negotiation approach capable of providing negotiation between participating semantic web services. The approach is based on the use of utility functions in the negotiation process and uses multiple attributes as the basis of negotiation. A communication model describing the negotiation process is presented, along with algorithms for the various activities involved. The work also proposes a novel concept of negotiation feedback using a novel data structure, the Agreement Table. This concept can help expedite the negotiation process by decreasing the number of negotiation steps needed to reach agreement. An evaluation of the work is presented, and a prototype system providing negotiation between semantic web services has been implemented.

Keywords: Semantic web, utility, semantic web service, negotiation
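A weighted-sum utility over several negotiation attributes, of the general kind used in utility-based approaches, can be sketched as follows. The attribute names, weights, and the closeness-to-ideal scoring below are illustrative assumptions, not the paper's exact functions:

```python
# Hypothetical sketch of utility-based multi-attribute evaluation of an
# offer. Each attribute is scored by its closeness to the requester's
# ideal value, then combined with per-attribute weights.

def utility(offer, weights, ideal):
    """Weighted-sum utility of an offer over several attributes."""
    return sum(w * (1.0 - abs(offer[a] - ideal[a]) / ideal[a])
               for a, w in weights.items())

# Requester values low price more than fast delivery (weights sum to 1)
weights = {"price": 0.6, "delivery_days": 0.4}
ideal   = {"price": 100.0, "delivery_days": 2.0}

offer = {"price": 120.0, "delivery_days": 3.0}
u = utility(offer, weights, ideal)
# accept if utility exceeds a reservation threshold, else counter-offer
print(round(u, 3))  # 0.68
```

In an alternating-offers protocol, each side would concede until some offer's utility crosses its reservation threshold for both participants.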

Title of the Paper: An Online Retrieving Method for Product Functional and Structural Information Based on the FGT Model


Authors: Feng Qi

Abstract: In reusable design, the reusability of the product information model is a key criterion for design efficiency. In order to improve the reusability of the product information model, an online retrieving method for product functional and structural information is discussed, based on the feature graph-tree model. In the model, a structural graph-tree is constituted with the assembly feature as its basic unit. Furthermore, a notation system is presented for representing the model. Based on the proposed feature-mapping algorithm, the assembly model can be built easily. Mapping between functions and structures is realized easily, and instances of closed loops in constraint resolution are avoided successfully. Finally, the integration of conceptual design and detailed design is well realized.

Keywords: Assembly model, Feature, reusable design, CAD

Title of the Paper: Algorithm for Map Coloring


Authors: Marius-Constantin Popescu, Liliana Popescu, Nikos Mastorakis

Abstract: This paper follows the steps necessary to realize map coloring, a problem that held the attention of many mathematicians for a long time. It discusses the four-color problem, as well as its solution through the implementation of an algorithm in the MAP-MAN application. It also addresses the drawing of maps in real time within GPS satellite systems, using multiple colors depending on the route covered and the landforms encountered.

Keywords: The four-color problem, The MAP-MAN application, Software for GPS
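A generic backtracking four-coloring of a map's adjacency graph (not the MAP-MAN implementation, whose details are in the paper) can be sketched as:

```python
# Illustrative backtracking four-coloring: neighbouring regions of a map
# must receive different colours; by the four-colour theorem, four
# colours always suffice for a planar map.

def color_map(adjacency, colors=("red", "green", "blue", "yellow")):
    """adjacency: dict region -> set of neighbouring regions.
    Returns dict region -> colour, or None if no colouring exists."""
    regions = list(adjacency)
    assignment = {}

    def backtrack(i):
        if i == len(regions):
            return True
        r = regions[i]
        for c in colors:
            # a colour is usable if no already-coloured neighbour has it
            if all(assignment.get(n) != c for n in adjacency[r]):
                assignment[r] = c
                if backtrack(i + 1):
                    return True
                del assignment[r]       # undo and try the next colour
        return False

    return assignment if backtrack(0) else None

adj = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
coloring = color_map(adj)
assert all(coloring[r] != coloring[n] for r in adj for n in adj[r])
```

For GPS-style real-time rendering, the same constraint check could be run incrementally as new regions enter the visible map.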

Title of the Paper: Scoring Functions of Approximation of STRIPS Planning by Linear Programming – Block World Example


Authors: Adam Galuszka

Abstract: STRIPS planning is the problem of finding a set of actions that transforms a given initial state into a desired goal situation. It is a computationally hard problem. In this work, an approximation of STRIPS block-world planning by linear programming is shown. The cost of this approach is that the algorithm can yield non-interpretable solutions for some initial states (which follows from the assumption P ≠ NP), because the discrete domain (true or false) is transformed into a continuous domain (an LP program). Additionally, two scoring functions are introduced to estimate the quality of the plan. The proposed approach is illustrated by an example simulation.

Keywords: STRIPS planning, Block world, computational efficiency, linear programming

Title of the Paper: Hierarchical Clustering of Distributed Object-Oriented Software Systems: A Generic Solution for Software-Hardware Mismatch Problem


Authors: Amal Abd El-Raouf

Abstract: During the software lifecycle, the software structure is subject to many changes in order to fulfill the customer’s requirements. In Distributed Object Oriented systems, software engineers face many challenges to solve the software-hardware mismatch problem in which the software structure does not match the customer’s underlying hardware. A major design problem of Object Oriented software systems is the efficient distribution of software classes among the different nodes in the system while maintaining two features: low-coupling and high software quality. In this paper, we present a new methodology for efficiently restructuring Distributed Object Oriented software systems to improve the overall system performance and to solve the software-hardware mismatch problem. Our method has two main phases. In the first phase, we use the hierarchical clustering method to restructure the target software application. As a result, all the possible clustering solutions that could be applied to the target software application are generated. In the second phase, we decide on the best-fit clustering solution according to the customer hardware organization.

Keywords: Software restructuring, hierarchical clustering, distributed systems, object oriented software, performance analysis, low coupling
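Agglomerative hierarchical clustering over a class-coupling measure, of the general kind the abstract describes, can be sketched as follows. The single-link criterion, the threshold, and the coupling values are illustrative assumptions, not the paper's method:

```python
# Sketch of single-link agglomerative clustering over a class-coupling
# matrix. Higher coupling means "more related", so the most strongly
# coupled pair of clusters is merged first, until no remaining pair
# exceeds the threshold (low coupling between resulting clusters).

def cluster(classes, coupling, threshold):
    """coupling: dict of frozenset({a, b}) -> strength in [0, 1]."""
    clusters = [{c} for c in classes]

    def link(c1, c2):  # single link: strongest coupling between members
        return max(coupling.get(frozenset({a, b}), 0.0)
                   for a in c1 for b in c2)

    while len(clusters) > 1:
        pairs = [(link(x, y), i, j)
                 for i, x in enumerate(clusters)
                 for j, y in enumerate(clusters) if i < j]
        best, i, j = max(pairs)
        if best < threshold:
            break                       # remaining clusters are low-coupled
        clusters[i] |= clusters[j]
        del clusters[j]
    return clusters

c = cluster(["Ui", "Db", "Net"],
            {frozenset({"Ui", "Db"}): 0.9, frozenset({"Db", "Net"}): 0.2},
            threshold=0.5)
print(sorted(sorted(x) for x in c))  # [['Db', 'Ui'], ['Net']]
```

Each cut level of the resulting hierarchy is one candidate distribution of classes over hardware nodes; the second phase would then pick the cut matching the customer's node count.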

Title of the Paper: An Evaluation Framework for Higher Education ERP Systems


Authors: Gheorghe Sabau, Mihaela Munten, Ana-Ramona Bologa, Razvan Bologa, Traian Surcel

Abstract: A Higher Education ERP system can be used as a solution to integrate and increase the efficiency of Romanian university processes. This paper examines the application of ERP software in Romanian universities. We performed a SWOT analysis for implementing an ERP system in Romanian universities. We also propose a comparison framework for ERP solutions for higher-education management, using the requirements of a Romanian university as a starting point. The framework was applied to four of the top higher-education management solutions. The process of evaluating an ERP system for use in higher education requires that a university compare the available ERP software against its current processes and determine which ERP solution best fits its existing procedures.

Keywords: Higher Education, University Management, ERP Systems, Integration, Quality Services, Evaluation Framework

Title of the Paper: Statistical Methods and Applied Computing in Academic Educational Marketing


Authors: Angela Repanovici, Bogdan Alexandrescu, Razvan Enoiu

Abstract: The mission of universities and their staff is to generate knowledge through scientific research. Their moral obligation to the society that pays for this research is to provide free access to the research results for everyone, everywhere. Along with Open Access journals, institutions needed to create institutional repositories for storing the universities' scientific production. A marketing research study is proposed, in the form of an exploratory, stratified inquiry conducted at the “Transilvania” University of Brasov, regarding the attitude and behaviour of the academic community towards the creation of a digital, free-access repository. Google Scholar is a scientometric database that can be consulted free of charge on the Internet and that indexes academic papers, also identifying the associated citations. The free Publish or Perish software can be used as an instrument for analysing the impact of research through citation analysis and the h-index. We present the exploratory study of the Transilvania University of Brasov regarding the impact and visibility of its scientific research.

Keywords: Statistical methods, institutional repositories, open access, marketing research, h index, scientometric indicators

Title of the Paper: Power-Aware, Depth-Optimum and Area Minimization Mapping of K-LUT Based FPGA Circuits


Authors: Ion Bucur, Nicolae Cupcea, Adrian Surpateanu, Costin Stefanescu, Florin Radulescu

Abstract: This paper introduces an efficient application intended for mapping under complex criteria applied to K-LUT based FPGA-implemented circuits. The application is based on an algorithm developed taking into consideration a significant design factor, power consumption, in addition to the design factors traditionally used. To increase performance, a flexible mapping tool was used, based on exhaustive generation of all K-bounded sub-circuits rooted at each node of the circuit. Information about dissipated logic power was obtained using an efficient dedicated simulator. In addition to lowering power consumption, we devised several effective mapping techniques designed to reduce area while preserving optimum depth.

Keywords: Power-aware, optimal area, K-LUT based FPGA, logic activity simulator, functional power

Title of the Paper: Innovation: Web 2.0, Online-Communities and Mobile Social Networking


Authors: Cheng-Jung Lee, Chang-Chun Tsai, Shung-Ming Tang, Liang-Kai Wang

Abstract: Recently, an online social network phenomenon has swept over the Web, and the signs suggest that Social Networking Sites (SNS) are growing in importance not just as places for individuals to communicate, network and express themselves, but also as advertising and marketing vehicles. Combining social networks with the mobile environment is of growing interest, as it allows users to participate in their online social community despite their mobility. To date, several surveys and studies have brought some insights into this field. However, their methods are often not general or detailed enough for evaluation and comparison. In this study we highlight the latest trends and the evolution of Mobile Social Networking and online communities. The existing research is reviewed and organized to summarize what we know about their usage. The paper concludes with a discussion of new developments, challenges and opportunities. There are many opportunities for future research and organizational applications of SNS as SNS adoption grows at incredible rates. What we present in this study can be generalized to other enterprise-grade social networks, whether run for a company's own business purposes or as a contract job for another company.

Keywords: Mobile, Social Networking, Web 2.0, E-Communities, Social Networks, Social Web

Issue 12, Volume 8, December 2009

Title of the Paper: Clustering the Source Code


Authors: Nadim Asif, Faisal Shahzad, Najia Saher, Waseem Nazar

Abstract: Software systems need to be understood and presented at higher levels of abstraction in order to perform changes and meet current requirements. When changes are performed, the source code drifts away from the existing system documentation (specifications, design, manuals), which represents the functionality of the software system. Software systems are developed using multiple languages with different dialects and scripts. This paper presents a clustering approach that uses the available source code, documentation, experience and knowledge about the domain and application to cluster the source code. Source code clustering is used to recover artifacts, understand the system and identify the relationships within the source code in order to plan, design and execute changes in software systems.

Keywords: Source Code Clustering, Source Code Analysis, Re-Engineering, Reverse Engineering, Design Recovery, Program Understanding, Software Maintenance, Clustering

Title of the Paper: Research of Replication Mechanism in P2P Network


Authors: Dongming Huang, Zong Hu

Abstract: A P2P network is a dynamic, self-organizing network in which peers can freely join or leave, so a lot of important data may be lost when important nodes fail; there is also load imbalance among the nodes of a P2P network. These features hinder the expansion of P2P applications, so this paper introduces a replication mechanism that not only enhances the reliability of the network but also balances the network load. It also introduces a synchronization mechanism to solve the problem of data-update consistency in the P2P network.

Keywords: P2P Network, Replication Mechanism, Load Transfer, Synchronize Mechanism, Data Update Consistency, Load Balance, Reliability

Title of the Paper: A New Recognition Method for Natural Images


Authors: Weiren Shi, Zuojin Li, Xin Shi, Zhi Zhong

Abstract: Natural image recognition is an important area of machine vision. This paper presents a novel approach to natural image recognition based on the non-Gaussian distribution property of natural images. In this new method, supervised classification is first conducted on the natural images based on their label values; then independent-component linear transforms are applied to each category of samples, transforming high-dimensional data into irrelevant independent components; finally, the probability distance between independent-component subspaces is used for unsupervised classification. This classification tree also exhibits some features of the signal processing of the biological optic nerve. An experiment on ORL Face Database identity recognition shows that this method features a high recognition rate and low time consumption; another experiment on direction determination for the autonomous navigation of intelligent robots also produced a good result.

Keywords: Natural image, independent component subspace, hierarchical discriminant regression, recognition of face, robot navigation

Title of the Paper: The Continuous Hopfield Networks (CHN) for the Placement of the Electronic Circuits Problem


Authors: M. Ettaouil, K. Elmoutaouakil, Y. Ghanou

Abstract: The Placement of the Electronic Circuits Problem (PECP) is considered one of the most difficult optimization problems. The PECP has been expressed as a Quadratic Knapsack Problem (QKP) with linear constraints. The goals of this work are to solve the PECP using Continuous Hopfield Networks (CHN) and to illustrate, from a computational point of view, the advantages of the CHN through its implementation for the PECP. The resolution of the QKP via the CHN is based on an energy or Lyapunov function, which diminishes as the system evolves until a local minimum is reached. The decomposition approach has previously been used to solve the PECP; that method suffers from problems of solution feasibility and long training times. Unlike the decomposition approach, the CHN is much faster and all its solutions are feasible. Finally, some computational experiments solving the PECP are included.

Keywords: Placement of the Electronic Circuits Problem (PECP), Continuous Hopfield Networks (CHN), Quadratic Knapsack Problem (QKP), combinatorial problems, satisfaction of the PECP constraints
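The energy-descent behaviour of a Continuous Hopfield Network can be illustrated on a tiny two-unit example; the QKP mapping of the PECP is omitted here, and the weights, bias and step size below are illustrative assumptions:

```python
import math

# Minimal sketch of a Continuous Hopfield Network relaxing toward a
# local minimum of its Lyapunov (energy) function. W must be symmetric
# with zero diagonal for the energy to be non-increasing.

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

W = [[0.0, -2.0], [-2.0, 0.0]]   # mutual inhibition between two units
b = [1.0, 1.0]
u = [0.3, -0.2]                  # internal states
dt = 0.05

def energy(v):
    quad = -0.5 * sum(W[i][j] * v[i] * v[j]
                      for i in range(2) for j in range(2))
    lin = -sum(b[i] * v[i] for i in range(2))
    # integral term of the CHN Lyapunov function for a sigmoid activation
    leak = sum(x * math.log(x) + (1 - x) * math.log(1 - x) for x in v)
    return quad + lin + leak

energies = []
for _ in range(200):
    v = [sigmoid(x) for x in u]
    energies.append(energy(v))
    # Euler step of the CHN dynamics du/dt = -u + W v + b
    u = [u[i] + dt * (-u[i] + sum(W[i][j] * v[j] for j in range(2)) + b[i])
         for i in range(2)]

assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

Solving the PECP this way means encoding the QKP objective and constraints into W and b so that low-energy states correspond to feasible placements.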

Title of the Paper: JEA K-128: A Novel Encryption Algorithm Using VHDL


Authors: Jamal N. Bani Salameh

Abstract: Data security is an important issue in computer networks, and cryptographic algorithms are essential parts of network security. In this paper a new block-cipher algorithm, JEA K-128 (Jamal Encryption Algorithm with a 128-bit key), is described. JEA K-128 is a symmetric block cipher suitable for hardware or software implementation, with a 64-bit word size, 4 rounds, and a 128-bit secret key. New cryptographic features of our work include repeated XORing of the plaintext with sub-keys, and the use of multiple multiplexers in the sub-key generator. The main goal in designing the JEA K-128 algorithm is to make almost every bit of the ciphertext depend on every bit of the plaintext and every bit of the key as quickly as possible. A simulation study shows that JEA K-128 gives a strong avalanche effect: when one bit of the plaintext or one bit of the key is changed, almost all bits of the ciphertext change. All code for our algorithm was captured in VHDL with structured description logic; VHDL was chosen for its suitability for hardware implementation. The design principles of JEA K-128 are given together with results and analyses to define the encryption algorithm precisely.

Keywords: Data encryption, block cipher, cryptography, key generator, VHDL implementation
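Measuring the avalanche effect, as the simulation study does, amounts to flipping one input bit and counting how many ciphertext bits change. The round function below is a stand-in toy cipher, NOT JEA K-128 (whose design is only in the paper); the rotation amount, mixing constant, and key schedule are all assumptions:

```python
# Toy demonstration of avalanche-effect measurement on a 64-bit block
# with a 128-bit key. Each round XORs a sub-key, rotates, and multiplies
# by an odd constant to diffuse bit differences.

def toy_encrypt(block, key, rounds=4):
    mask = (1 << 64) - 1
    for r in range(rounds):
        block ^= (key >> (r * 16)) & mask                # XOR round sub-key
        block = ((block << 13) | (block >> 51)) & mask   # rotate left by 13
        block = (block * 0x9E3779B97F4A7C15) & mask      # odd-constant mix
    return block

def avalanche(plain, key, bit):
    c1 = toy_encrypt(plain, key)
    c2 = toy_encrypt(plain ^ (1 << bit), key)  # flip one plaintext bit
    return bin(c1 ^ c2).count("1")             # differing ciphertext bits

changed = avalanche(0x0123456789ABCDEF,
                    0x00FF00FF00FF00FF00FF00FF00FF00FF, 7)
# a strong avalanche effect changes roughly half of the 64 output bits
print(changed)
```

The same measurement loop, run over all 64 plaintext bits and all 128 key bits, yields the avalanche statistics a simulation study would report.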

Title of the Paper: The Modelling and Designing of E-Supervised (E-SUV) for Distance Learning Centre


Authors: Salehuddin Shuib, Suhardi Hamid, H. S. Hanizan, Roshidi Din, Kamaruzaman Jusoff

Abstract: E-services are a suite of web-based products that bring profound changes in business models and value chains, especially in the Professional and Continuing Education Centre (PACE) environment, given its responsibility to manage all courses provided by every faculty to private colleges. PACE is currently running resourcefully, but it faces a few problems, such as the difficulty each faculty officer has in obtaining information about private colleges, and a lack of information management in the private colleges themselves. Therefore, the objective of this study is to propose a model of an e-service application called E-Supervised (E-SUV) to provide information related to the twinning programmes between Universiti Utara Malaysia (UUM) and private colleges, enabling users to interact via a portal, using the UML-based Web Engineering (UWE) approach. This methodology includes four modeling activities: requirements analysis, conceptual model, navigation model and presentation model. The research results in a web-based environment consisting of major modules such as profiles, forums, and an email service for users. In a nutshell, the UWE approach is a viable option for facilitating e-services in an education environment and could thus serve as a guideline model for future e-distance learning centres. Rigorous tests of how this model behaves in real-life practice would be a way forward to identifying its effectiveness.

Keywords: E-Services, UML, UWE, PACE, UUM, E-SUV

Title of the Paper: A Consistency Maintenance Project Independent Relationship of Distributed Data Copy for Supply Chain Management


Authors: Jui-Wen Hung

Abstract: With the trend toward global transnational enterprise layouts, the study of data-consistency convergence is key to improving competitiveness and the Bullwhip Effect problem in the supply chain. This paper proposes a specific ODMS-dependent replica-correctness maintenance engineering project. A resolution model provides adaptively adjustable update-routing policies in a hierarchical enterprise database. Several models are built on the basis of a real-time updating process framework, and different value-added update service schemas are also analyzed. Enterprise data residing in the internally heterogeneous supply-chain knowledge base has partial dependency relationships with the specified ODMS. The proposed comprehensive update mechanism combines flexible propagation of dependent data copies with global consistency-maintenance mechanisms to achieve data consistency both inside and outside the enterprise.

Keywords: Internet Data Base Original Source (IDBOS), Region Data Content Proxy (RDCP), ODMS, portal Agent, global consistency, Supply Chain Management

Title of the Paper: A C Compiler Generating a VHDL Source File for a Dynamic Dataflow Machine Executed Directly in Hardware


Authors: Jorge Luiz E. Silva, Kelton A. P. Da Costa, Valentin Obac Roda

Abstract: In order to convert a High Level Language (HLL) into hardware, a Control Dataflow Graph (CDFG) is a fundamental element; a dataflow architecture can be obtained directly from the CDFG. The ChipCflow project is described as a system to convert HLL into a dynamic dataflow graph to be executed in dynamically reconfigurable hardware, exploiting dynamic reconfiguration. ChipCflow consists of several parts: the compiler that converts a C program into a dataflow graph; the operators and their instances; the tagged tokens; and the matching data. In this paper, a C compiler that converts C into a dataflow graph, and the implementation of the graph in VHDL, are described. Some results are presented as a proof of concept for the project.

Keywords: C Compiler, Dynamic Dataflow Architecture, Dynamic Reconfigurable Hardware, Tagged-token, Matching-Data

Title of the Paper: Rotary-Code: Efficient MDS Array Codes for RAID-6 Disk Arrays


Authors: Yulin Wang, Guangjun Li

Abstract: Low encoding/decoding complexity is essential for practical RAID-6 storage systems. In this paper we describe a new coding scheme for RAID-6 disk arrays, which we call Rotary-code. We construct Rotary-code from a bit matrix-vector product similar to Reed-Solomon coding, and provide a geometric encoding method and detailed non-recursive decoding algorithms. The capability of two-disk fault tolerance and the Maximum Distance Separable (MDS) property are proved for Rotary-code. The key novelty of Rotary-code is that it has optimal encoding and decoding complexity compared with existing RAID-6 codes.

Keywords: Array code, MDS, RAID, Two-disk fault-tolerance, Efficient decoding, Decoding complexity
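For context, the classic Reed-Solomon-style RAID-6 P/Q parity that Rotary-code is compared against can be sketched over GF(2^8). This is the baseline scheme, not the Rotary-code construction itself:

```python
# Sketch of RAID-6 P/Q parity over GF(2^8) with generator g = 2 and the
# usual reducing polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11D).
# P tolerates one lost data disk; P and Q together tolerate any two.

def gf_mul(a, b):
    """Multiply two bytes in GF(2^8), reducing by 0x11D."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        b >>= 1
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1D
    return p

def pq_parity(data):
    """P = XOR of all data bytes; Q = XOR of g^i * d_i with g = 2."""
    P = Q = 0
    g_pow = 1
    for d in data:
        P ^= d
        Q ^= gf_mul(g_pow, d)
        g_pow = gf_mul(g_pow, 2)
    return P, Q

data = [0x11, 0x22, 0x33, 0x44]
P, Q = pq_parity(data)

# single data-disk loss: recover d_1 from P and the surviving data
lost = P ^ data[0] ^ data[2] ^ data[3]
assert lost == data[1]
```

The byte-level GF multiplications here are what make encoding and decoding costly, which is why MDS array codes such as Rotary-code aim for XOR-only constructions with lower complexity.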



Copyright © WSEAS