WSEAS CONFERENCES. WSEAS, Unifying the Science



Volume 5, 2008
ISSN: 1790-0832
E-ISSN: 2224-3402








Issue 1, Volume 5, January 2008

Title of the Paper: Application of Image Processing to Measure Road Distresses


Authors: Maher I. Rajab, Mohammad H. Alawi, Mohammed A. Saif

Abstract: Pavements usually experience different types of distresses due to repeated traffic loads, aggressive environmental conditions, construction materials, the soil condition of the underlying subgrade, and the method of construction. Longitudinal and transverse cracking, potholes, rutting, and bleeding are common examples of such distresses in flexible pavements. As time progresses, the severity of these distresses increases and, consequently, ride quality is adversely affected. Early detection and measurement of the extent of distresses, coupled with prompt remedial measures, are necessary to keep the pavement functioning at an acceptable level. Traditional methods for distress detection and measurement are laborious, time consuming, and expose the personnel involved to accidents. In contrast, image-based measurement methods are effortless, safe, and can be performed in a short time. This paper applies image processing measurements to estimate the areas of a pothole and of alligator cracking, and develops a program for plane measurements of an area that experiences rutting. The image measurements are compared with traditional measurements, and the results show that the two are in close agreement.

Keywords: Pavement surface cracking, road distresses, alligator cracking, rutting, image measurements.

Title of the Paper: A Design and Implementation of a Web Server Log File Analyzer


Authors: Yu-Hsin Cheng, Chien-Hung Huang

Abstract: More and more enterprises use networks to communicate with their suppliers and customers, and to handle order taking, data transmission, production, and warehousing throughout their management procedures; moving these activities online not only saves production costs but also allows relevant information to be reflected quickly, improving competitiveness. However, enterprise owners or website administrators often cannot judge precisely whether their commercial websites play a leading role in marketing. Intelligent website analysis software can provide effective assistance with this problem and thereby improve the marketing results of a website. Such software usually processes the log files of the system to produce statistical data, from which useful information is obtained through analysis. In this paper, we discuss the implementation of an enterprise website analysis tool we have developed by modifying and extending the free software Analog to analyze the log files of the Apache web server, providing more detailed information about visitors. The system offers a user-friendly interface through which enterprise administrators can accumulate the important information in log files; in addition, we add functions that detect illegal information and take protective action, and a dynamic e-mail report function for the user.

Keywords: log file, web server, website analysis, Configuration Interface Subsystem, Illegal Information Detection Subsystem, Report Delivery Subsystem

Title of the Paper: Batu Aceh Typology Identification Using Back Propagation Algorithm


Authors: Azlinah Mohamed, Faizatul Huda Bt Mat, Sofianita Mutalib, Shuzlina Abdul Rahman, Noor Habibah Arshad

Abstract: Historical and cultural artifacts have defined the existence of humankind all over the world, and it is therefore important to preserve this heritage for future endeavors. In the Malay-Indonesian Archipelago, a forgotten and vanishing artifact is the Islamic gravestone that originated in Aceh, known as Batu Aceh. This research is an attempt to preserve this heritage by developing a prototype that guides future generations to appreciate these precious cultural heritage artifacts. The back propagation algorithm is applied to the supervised classification of images of Batu Aceh objects. Several images of each type of Batu Aceh are used as training samples; the samples are converted into binary form and used to train the network. To ensure good performance, network parameters such as the momentum value, learning rate, and number of hidden neurons are adjusted to find appropriate settings. The resulting prototype can identify the type of a presented Batu Aceh and its century. This research provides experts in the field with an alternative means of identifying damaged or unclear images of Batu Aceh.

Keywords: Artificial Neural Network, Back Propagation, Batu Aceh, Image Processing, Image Classification and Pattern Recognition.
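The training loop the abstract describes, tuning learning rate, momentum, and hidden-neuron count for a back-propagation classifier, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the network size, toy data, and parameter values are invented for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bp(X, y, hidden=4, lr=0.1, momentum=0.9, epochs=5000, seed=0):
    """One-hidden-layer back propagation with a momentum term,
    exposing the parameters the abstract says are tuned:
    learning rate, momentum value, and number of hidden neurons."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    dW1, dW2 = np.zeros_like(W1), np.zeros_like(W2)
    for _ in range(epochs):
        h = sigmoid(X @ W1)                       # forward pass
        out = sigmoid(h @ W2)
        delta2 = (out - y) * out * (1.0 - out)    # output-layer error signal
        delta1 = (delta2 @ W2.T) * h * (1.0 - h)  # back-propagated to hidden layer
        dW2 = momentum * dW2 - lr * (h.T @ delta2)
        dW1 = momentum * dW1 - lr * (X.T @ delta1)
        W1, W2 = W1 + dW1, W2 + dW2
    return W1, W2

# Tiny binary patterns standing in for binarised gravestone images.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)  # illustrative "type present" label
W1, W2 = train_bp(X, y)
pred = sigmoid(sigmoid(X @ W1) @ W2)
```

In the paper's setting the inputs would be the binarised image vectors rather than these toy patterns, and the three tunable parameters above would be swept to find the setting with the best classification performance.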

Title of the Paper: A Neuro-Fuzzy Model for Function Point Calibration


Authors: Wei Xia, Luiz Fernando Capretz, Danny Ho

Abstract: This paper discusses the need to recalibrate the Function Point (FP) complexity weights, with the aims of fitting specific software applications, reflecting software industry trends, and improving cost estimation. Neuro-Fuzzy is a technique that combines the learning ability of neural networks with the ability of fuzzy logic to capture human knowledge. Empirical validation using Release 8 of the ISBSG data repository shows a 22% improvement in software effort estimation after calibration with the Neuro-Fuzzy technique.

Keywords: neuro-fuzzy, neural networks, fuzzy logic, software cost estimation.

Title of the Paper: Applications of Fuzzy Theory on Health Care: An Example of Depression Disorder Classification Based on FCM


Authors: Sen-Chi Yu, Yuan-Horng Lin

Abstract: The purpose of this study is to apply fuzzy theory to health care. The Beck Depression Inventory (BDI)-II was adopted as the instrument, and outpatients of a psychiatric clinic were recruited as the clinical sample, with undergraduates as a non-clinical sample. To elicit membership degrees, subjects were free to choose more than one alternative for each item of the BDI and to assign percentages to the chosen alternatives, with the percentages restricted to sum to 100%. We applied possibility-based classification (fuzzy c-means, FCM) and probability-based classification (Ward's method and k-means) to classify the severity of depression. The BDI scores of the subjects were analyzed by cluster analysis, with a psychiatrist's diagnosis of depression severity used as the criterion for evaluating classification accuracy. The percentages of correct classification of FCM, Ward's method, and k-means were compared. The Kendall's τ coefficients of FCM, Ward's method, and k-means were .549, .316, and .395, respectively; that is, FCM exhibited a higher association between the original and classified memberships than did Ward's method and k-means. We conclude that FCM identifies the data structure more accurately than the two crisp clustering methods, and suggest that considerable costs of preventing and treating depression might be reduced via FCM.

Keywords: fuzzy c-means, depression, fuzzy logic, psychological assessment.
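The possibility-based classifier compared in the abstract above, fuzzy c-means, differs from crisp k-means in returning graded memberships rather than hard labels. A minimal sketch of the standard FCM iteration follows; it is not the authors' code, and the 1-D "severity scores" are invented toy data.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: alternate the centre update (membership-weighted
    means) with the membership update u_ik proportional to
    d_ik^(-2/(m-1)), so every point belongs to every cluster by degree."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                 # membership columns sum to 1
    for _ in range(iters):
        Um = U ** m                    # fuzzified memberships
        centres = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2)
        d = np.maximum(d, 1e-12)       # avoid division by zero at a centre
        U = d ** (-2.0 / (m - 1))
        U /= U.sum(axis=0)
    return centres, U

# Two well-separated toy groups of 1-D "severity" scores.
X = np.array([[1.0], [2.0], [1.5], [9.0], [10.0], [9.5]])
centres, U = fcm(X)
```

A crisp classifier would be recovered by taking `U.argmax(axis=0)`; the study's point is that the graded `U` itself tracks the clinician's severity judgments more closely than hard assignments.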

Issue 2, Volume 5, February 2008

Title of the Paper: A Detecting Peak's Number Technique for Multimodal Function Optimization


Authors: Qiang Hua, Bin Wu, Hao Tian

Abstract: A Detecting Peak's Number (DPN) technique is proposed for multimodal optimization. The DPN technique seeks the number of peaks in the locally multimodal domain of every individual: first, the idea of orthogonal intersection is used to obtain exploration directions in each locally multimodal domain, and then the number of peaks is detected along every one-dimensional direction as the result of probing that domain. Based on the DPN technique, we design an evolutionary algorithm (DPNA) with four characteristics: niching, variable population, variable radius, and lifetime. A series of experimental results shows the effectiveness of the algorithm: DPNA is not only suited to obtaining multiple optima or suboptima, but is also effective for the ill-scaled and locally multimodal problems described in [11].

Keywords: Evolution Algorithm; Multimodal Function Optimization; Niching; Detecting Peak's Number

Title of the Paper: Context-Dependent Extensible Syntax-Oriented Verifier with Recursive Verification


Authors: Nhor Sok Lang, Takao Shimomura, Quan Liang Chen, Kenji Ikeda

Abstract: To develop high-quality Web applications, it is important to apply efficient frameworks that standardize the development process within projects, or useful design patterns that produce program code that can easily be enhanced. However, this is not enough. In addition to these efforts, we have to check whether the programs observe various kinds of rules, such as security verification items that are common to all kinds of Web applications, and coding-style or code-convention verification items that pertain to each project. This paper proposes a verification method for a syntax-oriented verifier and describes the implementation of its prototype system, SyntaxVerifier. SyntaxVerifier makes it possible to detect syntactical objects based on their syntactic contexts, and it realizes recursive verification, which makes it easy to trace a syntax tree dynamically within a verification process.

Keywords: Customizable, Extensible, Recursive, Syntax analysis, Verification

Title of the Paper: Stroke Order Computer-based Assessment with Fuzzy Measure Scoring


Authors: Guey-Shya Chen, Yu-Du Jheng, Hsu-Chan Yao, Hsiang-Chuan Liu

Abstract: The purpose of this research is to develop a computer-based assessment of stroke order and a novel scoring algorithm, based on the Choquet integral with fuzzy measures, for the stroke order of Chinese character writing. Learning stroke order is very important in a fixed-stroke character writing system; therefore, building on the theory of the "five indexing systems of Chinese characters" and combining the features of strokes, this research develops a stroke-order assessment system that evaluates learners' stroke order in real time and scores their character writing. In addition to the traditional additive scoring method, a non-additive scoring algorithm based on the Choquet integral with the λ-measure and the L-measure is proposed. A real data set with 121 samples was examined, and the experimental results show that this novel scoring model based on the Choquet integral with both the λ-measure and the L-measure outperforms the traditional scoring approach.

Keywords: stroke order, computer-based assessment, five indexing systems of Chinese characters, λ-measure, L-measure, Choquet integral
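The non-additive scoring described above rests on the discrete Choquet integral with a λ-fuzzy measure, in which g(A∪B) = g(A) + g(B) + λ·g(A)·g(B) and λ is fixed by requiring the whole criteria set to have measure 1. The following is a hedged sketch of that machinery only; the singleton densities and scores are illustrative, and the authors' L-measure is not reproduced here.

```python
def solve_lambda(densities):
    """Find lambda such that prod_i(1 + lam*g_i) = 1 + lam, lam != 0,
    by bisection; this makes the full set have measure exactly 1."""
    s = sum(densities)
    if abs(s - 1.0) < 1e-9:
        return 0.0                      # additive case: lambda = 0
    # lambda > 0 when densities sum below 1, in (-1, 0) when above 1
    lo, hi = (1e-9, 1e6) if s < 1.0 else (-1.0 + 1e-9, -1e-9)
    def f(lam):
        p = 1.0
        for g in densities:
            p *= 1.0 + lam * g
        return p - (1.0 + lam)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

def choquet(scores, densities):
    """Discrete Choquet integral of non-negative scores with respect to
    the lambda-fuzzy measure built from the singleton densities."""
    lam = solve_lambda(densities)
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    xs = [scores[i] for i in order]
    g, result = 0.0, 0.0
    for k, i in enumerate(order):
        # g of the top-(k+1) criteria via g(A ∪ {i}) = g(A)+g_i+lam*g(A)*g_i
        g = g + densities[i] + lam * g * densities[i]
        x_next = xs[k + 1] if k + 1 < len(xs) else 0.0
        result += (xs[k] - x_next) * g
    return result
```

With equal densities summing to 1 the integral reduces to the ordinary weighted mean; with interacting criteria (densities not summing to 1) the λ term lets the measure reward or penalize combinations of stroke-order criteria non-additively.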

Title of the Paper: Analysis of Neural Network Edge Pattern Detectors in Terms of Domain Functions


Authors: Maher I. Rajab, Khalid A. Al-Hindi

Abstract: This paper investigates the analysis of feed-forward BP neural networks trained to detect noisy edge patterns, so as to gain close insight into their internal functionality. The hidden units of the neural-network edge detector, viewed as templates, were decomposed into three gradient components: low-pass (averaging), gradient, and second-order gradient. The weights between the NNet's hidden units and its output units represent the importance of each hidden unit's edge-detection outcome. To this end, the elements of NNets trained to detect prototype noisy edge patterns at various angles of operation were analysed in terms of domain functions. The results show that NNet analysis using the domain-function method can confirm the NNets' recognition accuracies. Although the work presented only gives analysis results for the hidden units of the NNets, it should be clear that a characterization of the neural network as a whole could also be derived from these results.

Keywords: Neural networks analysis, domain functions, BP, recognition accuracy, domain-specific base functions, Taylor series coefficients.

Title of the Paper: Using System Dynamics for Managing Risks in Information Systems


Authors: Denis Trcek

Abstract: Each and every security-oriented activity in information systems has to start with the basics, which is risk management. Although risk management is a well-established and well-known discipline in many other areas, its direct translation to information systems is not easy and straightforward because of the specifics of contemporary information systems. Among these specifics are the global connectivity of information systems, the large number of elements (e.g. thousands of software components), the strong involvement of the human factor, the almost endless possible ways of interaction, etc. Thus a new methodological approach, based on business dynamics, is presented in this paper. It enables the above-mentioned elements to be addressed effectively, and through this it supports and improves decision making in information systems security.

Keywords: information systems, security, risk management methodologies, system dynamics, simulations, reference models.

Title of the Paper: Contrast Enhancement and Clustering Segmentation of Gray Level Images with Quantitative Information Evaluation


Authors: Zhengmao Ye, Habib Mohamadian, Su-Seng Pang, Sitharama Iyengar

Abstract: Improper illumination and medium dispersion occur in many gray-level image collection processes. Contrast enhancement and clustering segmentation are two effective approaches to the related pattern recognition problems, and image enhancement and image segmentation apply to different areas of science and engineering, such as biometric identification, national defense, and resource exploration. Adaptive image enhancement can be implemented to improve image quality and reduce random noise simultaneously, adapting to the intensity distribution within an image. Nonlinear K-means clustering can be applied to image segmentation, classifying an image into parts that have strong correlations with objects so as to reflect the actual information being collected; for example, it can be used against the effects of unevenly distributed pressure or temperature conditions in atmospheric and water media. To evaluate the actual roles of image enhancement and image segmentation, quantitative measures should be taken into account. In this study, a set of quantitative measures is proposed to evaluate the information flow of gray-level image processing: the gray-level energy, discrete entropy, relative entropy, and mutual information are used to measure the outcomes of adaptive image enhancement and K-means image clustering.

Keywords: Contrast Enhancement, K-Means Segmentation, Gray Level Image, Energy, Entropy, Relative Entropy, Mutual Information
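The quantitative measures proposed above, gray-level energy, discrete entropy, and relative entropy, are all computable from the normalized gray-level histogram. A small sketch under that reading (illustrative, not the paper's code; the two-level test image is invented):

```python
import numpy as np

def gray_histogram(img, levels=256):
    """Normalized gray-level histogram p_k of an 8-bit image."""
    h = np.bincount(img.ravel(), minlength=levels).astype(float)
    return h / h.sum()

def discrete_entropy(p):
    """H = -sum_k p_k log2 p_k over the occupied gray levels."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gray_energy(p):
    """E = sum_k p_k^2: close to 1 for images dominated by one level."""
    return float((p ** 2).sum())

def relative_entropy(p, q):
    """D(p||q) = sum_k p_k log2(p_k / q_k); q must be nonzero wherever p is."""
    m = p > 0
    return float((p[m] * np.log2(p[m] / q[m])).sum())

img = np.array([[0, 255], [255, 0]], dtype=np.uint8)  # two-level test image
p = gray_histogram(img)
```

Comparing the entropy and energy of the histograms before and after enhancement, and the relative entropy between them, gives exactly the kind of information-flow accounting the abstract describes.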

Title of the Paper: A Case Study of Usability Testing – The SUMI Evaluation Approach of the EducaNext Portal


Authors: Tanja Arh, Borka Jerman Blazic

Abstract: A new challenge for designers and human-computer interaction (HCI) researchers is to develop software tools and applications for effective e-learning. Software usability is one aspect of HCI that can benefit from knowledge of the users and their tasks. In practice, however, not much attention is given to this issue during testing: evaluators often do not have the knowledge, instruments, and/or time available to handle usability. One set of methods for determining whether an application enables users to achieve their predetermined goals effectively and efficiently is usability evaluation with end users. The paper presents the results of an empirical usability evaluation based on SUMI (Software Usability Measurement Inventory) questionnaires. The software application tested was the multilingual educational portal EducaNext.

Keywords: human computer interaction (HCI), EducaNext educational portal, distance learning, software usability measurement inventory (SUMI), usability, end users

Title of the Paper: Dynamic Backward Scheduling Method for Max-Plus Linear Systems with a Repetitive, MIMO, FIFO Structure


Authors: Hiroyuki Goto, Shiro Masuda

Abstract: This paper proposes a dynamic scheduling method that takes into account the no-concurrency of a resource with a subsequent event. We focus on repetitive discrete event systems with a MIMO (Multiple-Input and Multiple-Output) FIFO structure. The behavior of this kind of system can be described by linear equations in the max-plus algebra, and such systems are referred to as MPL (Max-Plus Linear). The conventional MPL representation formulates the no-concurrency of a resource with a previous event and can provide the earliest starting times in internal facilities. We recently proposed a calculation method for a single job; however, it does not consider resource no-concurrency with a subsequent job, and thus may not give an optimal solution when a large number of jobs have to be considered simultaneously. Therefore, we derive a general form of the MPL representation that provides the latest start times while taking no-concurrency with a subsequent event into account. We also develop an effective rescheduling method that is applicable when the system parameters change after a job has commenced.

Keywords: Repetitive system, state-space representation, max-plus linear system, earliest/latest time, forward/backward type, rescheduling
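The max-plus algebra underlying the MPL representation replaces "+" by "max" and "×" by "+", with ε = −∞ as the additive identity; earliest times come from a forward (max-plus) pass and latest times from its backward (min-plus) dual, which is the forward/backward pairing named in the keywords. A toy sketch, assuming a two-machine series line with invented processing times (not the paper's model):

```python
import numpy as np

EPS = -np.inf  # the max-plus "zero" element, usually written epsilon

def mp_forward(A, x):
    """Max-plus product (A ⊗ x)_i = max_j (A_ij + x_j):
    earliest completion times given input firing times x."""
    return np.max(A + x[None, :], axis=1)

def mp_backward(A, d):
    """Min-plus dual x_j = min_i (d_i - A_ij):
    latest event times that still meet the deadlines d."""
    return np.min(d[:, None] - A, axis=0)

# Machine 1 (3 time units) feeds machine 2 (5 time units);
# A_ij holds the total delay from event j to completion i.
A = np.array([[3.0, EPS],
              [8.0, 5.0]])
x = np.array([0.0, EPS])            # raw material enters at t = 0
earliest = mp_forward(A, x)         # completion times of the two machines
latest = mp_backward(A, earliest)   # latest inputs that still meet them
```

Here the forward pass yields completion times (3, 8), and the backward pass recovers the latest admissible input times (0, 3): machine 2 may start its 5-unit job as late as t = 3 without delaying the t = 8 deadline, which is the essence of backward scheduling in MPL systems.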

Title of the Paper: Intelligent Support to Specific Design Aspects


Authors: Urska Sancin, Jasmin Kaljun, Bojan Dolsak

Abstract: Owing to increased rivalry on the market, specific design aspects such as the use of modern plastic materials, ergonomics, and aesthetics are becoming more important and are no longer kept in the background relative to the functionality and economics of the product. A design engineer faces many dilemmas while designing new products, as a single person cannot possess the wide spectrum of knowledge needed for optimal design solutions. At this point designers have to rely on their own experience and on the knowledge of the expert team involved in the project. The fundamental purpose of the research presented in this paper is to make the product development process less experience-dependent. The main goal of this thematically oriented research is to develop an intelligent advisory system with integrated modules for specific design aspects.

Keywords: product development, ergonomics, aesthetics, plastics, knowledge based engineering, intelligent systems

Title of the Paper: Mining Long High Utility Itemsets in Transaction Databases


Authors: Guangzhu Yu, Shihuang Shao, Xianhui Zeng

Abstract: Existing algorithms for utility mining are column-enumeration based and adopt an Apriori-like candidate generation-and-test approach, and thus are inadequate on datasets with high dimensionality or long patterns. To solve this problem, this paper proposes a hybrid model and a row-enumeration-based algorithm, inter-transaction, to discover high utility itemsets from two directions: existing algorithms such as UMining [1] can be used to seek short high utility itemsets from the bottom, while inter-transaction seeks long high utility itemsets from the top. By intersecting relevant transactions, the new algorithm can identify long high utility itemsets directly, without extending short itemsets step by step. In addition, new pruning strategies are used to cut down the search space, and an optimization technique is adopted to improve the performance of the transaction intersections. Experiments on synthetic data show that our method achieves high performance, especially on large, high-dimensional datasets.

Keywords: long high utility itemset; hybrid mode; utility; intersection transaction; row enumeration

Issue 3, Volume 5, March 2008

Title of the Paper: An Interactive Web-Based Wedding Planner with Comparative Analysis Decision Support System


Authors: L. Y. Por, R. F. Boey, T. F. Ang, C. S. Liew

Abstract: Preparing for a wedding is always tedious, especially without hired help. Long checklists await soon-to-be brides and grooms before their auspicious wedding ceremony, and without experienced assistance they face frustrating situations in hunting for suitable bridal products and services. This work presents a one-stop solution, called Wedding Arch, for brides and grooms to retrieve information on available bridal products and services in the shortest possible time. This web-based wedding planner provides a platform for brides and grooms to acquire information on bridal products and services, as well as on the vendors registered with Wedding Arch, and to make wedding planning reservations online with a simple click of the mouse; the long and tedious task of information gathering is thus shortened and made more convenient. Most importantly, Wedding Arch also functions as a web-based comparative analysis decision support system: brides and grooms can subscribe to a service that assists them in wedding planning and preparation, supporting their decisions based on their preferences and budget while taking into account current market prices for the desired bridal products and services.

Keywords: Wedding Arch, Wedding planner, Decision support system (DSS), Budget planning and Web-based.

Title of the Paper: FAQ-master: An Ontological Multi-Agent System for Web FAQ Services


Authors: Sheng-Yuan Yang, Chun-Liang Hsu, Dong-Liang Lee, Lawrence Y. Deng

Abstract: This paper presents the results of our research in developing a multi-agent system, FAQ-master, an intelligent Web information integration system with intelligent retrieval, filtering, integration, and ranking capabilities that provides high-quality FAQ answers from the Web to meet users' information requests. We describe FAQ-master and discuss how it simultaneously improves FAQ query quality from three aspects of Web search activity: user intention, document content processing, and website search. The system is implemented as four agents working together through an ontology-supported content base. Techniques involved in the design include domain ontology; ontology-supported user query processing; ontology-supported solution caching; ontology-directed FAQ wrapping, storage, retrieval, and ranking; and ontology-supported website classification and expansion.

Keywords: Ontology, User Modeling, Proxy Mechanism, Website Modeling, Multi-Agent Systems.

Title of the Paper: Online and Post-processing of AVL Data in Public Bus Transportation System


Authors: Dejan Rancic, Bratislav Predic, Vladan Mihajlovic

Abstract: Vehicle fleet tracking systems have been in use for a number of years in companies operating large numbers of vehicles in the field. The proliferation of cheap and compact GPS receivers has had the effect that Automatic Vehicle Location (AVL) systems today almost exclusively use satellite-based locating. This paper presents the characteristics of a deployed system for tracking and analyzing city bus transit traffic in the city of Niš. Since the described system also uses GPS for vehicle location, the vehicle location data acquired from the AVL subsystem must first be transformed into a form adequate for further use in the system, for analysis, and for prediction. City bus transit tracking systems also have specific functionalities that distinguish them from other vehicle fleet tracking systems, including cyclic vehicle routes and the need for specific performance reports. This paper deals in particular with vehicle motion prediction and the estimation of on-station arrival times, coupled with automatic report generation.

Keywords: Automatic Vehicle Location (AVL), Prediction of arrival times, Continuous query processing, Information services, Intelligent transportation systems (ITS), line matching

Title of the Paper: Questionnaire-based Evaluation of Characteristics of a Community in SNS


Authors: Susumu Takeuchi, Masanori Akiyoshi, Norihisa Komoda

Abstract: SNSes (Social Networking Services) are now widely available on the Internet. In an SNS, users can communicate within a community with users who share a similar interest. To activate communication in an SNS more strongly, users should be encouraged to join other, dissimilar communities. In this paper, utilizing the activities of communities and the relationships between communities is proposed as a way to realize dissimilar-community recommendation. However, such communities tend to be useless to users, so the characteristics of the communities selected by these methods must be investigated before they are recommended. Using real data from the largest SNS in Japan, the correlations between users' subjective judgments and the characteristics of communities are evaluated. The results clarify that effective recommendation of dissimilar communities will be possible by integrating the activities of communities and the relationships between communities.

Keywords: SNS, Social network, Interpersonal communication, Activation, Community, Recommendation

Title of the Paper: An Extraction of Emotion in Human Speech Using Speech Synthesize and Classifiers for Each Emotion


Authors: Masaki Kurematsu, Jun Hakura, Hamido Fujita

Abstract: The typical method of estimating emotion in speech has three steps. First, researchers collect a large amount of human speech. Next, they extract speech features from it using frequency analysis and calculate statistics of those features. Finally, they build a classifier from the statistics using a learning algorithm. Most researchers work on the collection of human speech, feature selection, and the learning algorithm in order to increase the validity of the estimation, but the validity remains low. In this paper, we propose three new methods to enhance the typical approach. First, we use synthetic speech to build the classifier. Second, we use not only the mean and maximum but also the standard deviation (SD), skewness, and kurtosis of the features. Third, we use a separate classifier for each emotion. We conducted experiments to evaluate our approach, and the results show that it is a promising improvement over the former method.

Keywords: Emotion, Speech Synthesize, Regression Tree, Linear Discriminant Analysis, Extraction of Emotion in Speech

Title of the Paper: Prototype of Knowledge Management System in Chinese Offshore Software Development Company


Authors: Li Cai, Zuoqi Wang, Yufeng Jiao, Masanori Akiyoshi, Norihisa Komoda

Abstract: This paper illustrates the preliminary design and evaluation of a knowledge management system in a Chinese offshore software development company. The major features of the system are a BBS-style Q&A function, management of Q&A results based on domain knowledge, an association retrieval function for coping with grammatical ambiguity in queries, and a display of Q&A texts on a plane for grasping the general distribution of knowledge and problems. The prototype system was developed using Java and the .NET Framework and has been used in-house. Through this trial use, not only its advantages but also several problems to be improved were clarified.

Keywords: knowledge management, question and answer system, association retrieval

Title of the Paper: Decision Making With Textual and Spatial Information


Authors: Hana Kopackova, Jitka Komarkova, Pavel Sedlak

Abstract: The aim of this paper is to show how textual and spatial information can be used in the decision-making process. Structured information represents only 10% of available information; despite this fact, managers mostly rely on this type of information. Because of their marginalization in decision-making practice, we focus on textual and spatial information in this article. Four case studies clearly illustrate the use of these types of information through practical examples.

Keywords: Decision Making, Information, Text Categorization, Clustering, GIS.

Title of the Paper: Application of Social Relation Graphs for Early Detection of Transient Spammers


Authors: Radoslaw Brendel, Henryk Krawczyk

Abstract: Early detection of social threats and anomalies is a real challenge in today's dynamic societies. People form many complex social relations that can be represented by various types of graphs, in which the nodes represent the subjects (individuals or groups of people) and the links indicate specific relations between them. The analysis of these constantly changing relations can point out specific social threats that are imminent. By observing the tendency of changes in the social relation graphs, such threats can be detected early and adequate preventative steps can be taken. In this paper we present how this approach can be used efficiently to detect imminent spam threats to a local e-mail society early, and to isolate groups of spammers before their messages reach the users' inboxes.

Keywords: social graphs, imminent threats, graph patterns, spam detection, security

Title of the Paper: A New Method for Enhanced Information Content in Product Model


Authors: Laszlo Horvath

Abstract: This paper introduces a new method for establishing better communication between engineers and modeling procedures during the lifecycle management of product information in computer-based engineering systems. The proposed method is one of the initial attempts to resolve an inherent conflict in product modeling: the conflict between the information-content-based thinking of the engineer and the data-based processing of classical product modeling systems. The author applies a new sector of the product model that describes information content for engineering activities; this sector controls the engineering object data in a second sector holding classical product model entities. In this manner, the new modeling can be interconnected with existing modeling in industrially applied engineering systems. Information content is stored in interconnected levels of the product model for human intent, meaning of concepts, engineering objectives, contexts, and decisions. The paper explains basic concepts, refers to research on related topics, introduces the role of humans and human-computer interaction in the proposed modeling, and details information-content-based product modeling. Following this, it emphasizes the difficulty of decision making when there is a high number of dependencies among engineering objects in large product models, and proposes a new method for change management in those models. The proposed product modeling methods can be implemented in industrial professional modeling systems by using their open-architecture functionality.

Keywords: Product modeling, Lifecycle management of product information, Information content based product model, Design intent, Product changes, Affect zone of modified engineering object

Title of the Paper: Warehouse Redesign to Satisfy Tight Supply Chain Management Constraints


Authors: Roman Buil, Miquel Angel Piera

Abstract: The picking process is a critical supply chain component for many companies. A proper warehouse configuration, storage policy, tray replenishment policy, and other factors are also important, not only to reduce delivery time but also to increase productivity while maintaining quality factors at competitive costs. This paper presents an integrated approach to tackle the complexity of warehouse redesign under several space and quality-factor constraints. The proposed design methodology integrates warehouse layout configuration, storage policy, replenishment policy, picking policy, routing policy, and the design of tray and shelf sizes. As a result of the proposed strategy, considerable savings on resource costs (equipment and workers) can be achieved.

Keywords: Picking, warehouse, simulation, colored petri nets, logistics, supply chain.

Title of the Paper: Outcome Based Education Performance Measurement: A Rasch-based Longitudinal Assessment Model to measure Information Management Courses LO’s


Authors: Azlinah Mohamed, Azrilah Abd. Aziz, Abd. Rahman Ahlan Sohaimi Zakaria, Mohdsaidfudin Masodi

Abstract: The Malaysian Qualifications Framework, 2005 (MQF) promotes an outcome-based education (OBE) learning process. OBE calls for the evaluation of a course’s Learning Outcomes (CLO) as specified in the Program Specification. This good practice is implemented in the teaching and learning processes of the Faculty of Information Technology and Quantitative Science, Universiti Teknologi MARA (FTMSK), which was duly certified to ISO 9001:2000. Assessment methods include giving students tasks such as tests, quizzes or assignments at intervals during the 14-week study period per semester. CLOs were evaluated based on the students’ performance, which gives an indication of their learning achievements. Although the marks obtained are ordered, they lie on a continuum scale; hence, further evaluation using the raw scores is rather complex to carry out. This paper describes a Rasch-based measurement model as a performance assessment tool to measure the CLO. The Rasch Model uses the ‘logit’ as its measurement unit and thus transforms the assessment results onto a linear scale. An overview of the measurement model and its key concepts is presented and illustrated. An assessment form using Bloom’s Taxonomy as a parameter was designed, showing each dimension of the ability to be measured. The results were evaluated on how well they relate to the attributes being assessed and were scrutinized. They were further checked against the CLO maps for consistency and used as a guide for future improvement of teaching method and style. This gives lecturers a more accurate insight into the level of competency each student has achieved. The study shows that this measurement model can accurately classify students’ grades onto a linear competency scale using only very few primary data sets, enabling corrective action to be taken effectively at an early stage of learning.

Keywords: Learning Outcomes, performance assessment, evaluation, competency, Bloom’s Taxonomy, quality.
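The ‘logit’ transformation the abstract refers to is the log-odds of a raw score; a minimal sketch (the half-point adjustment for extreme scores is a common convention, not necessarily the authors’ exact treatment):

```python
import math

def raw_score_to_logit(correct, total):
    """Convert a raw score into a logit measure, ln(p / (1 - p)).

    Extreme scores (zero or full marks) are nudged by half a point, a
    common convention, since the logit is undefined at p = 0 and p = 1.
    """
    correct = min(max(correct, 0.5), total - 0.5)
    p = correct / total
    return math.log(p / (1 - p))
```

A score of exactly half the maximum maps to 0 logits, and equal raw-score gaps near the extremes map to larger logit gaps, which is what puts the measure on an interval (linear) scale.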

Title of the Paper: Testing Cognitive Characteristics of Users in Interaction with Computer


Authors: Nebojsa Dordevic, Dejan Rancic

Abstract: This paper presents some new research results on Human Computer Interaction (HCI) methodologies. We present an extension of cognitive model for HCI - (XUAN/t), based on decomposition of user dialogue into elementary actions (GOMS). Using this model, descriptions of elementary (sensor, cognitive and motor) actions performed by user and system are introduced sequentially, as they will happen. Based on the described model and psychometric concepts, we developed software CASE tool for testing cognitive as well as psychomotor abilities of a user in HCI. Software tool arranges tests into test groups for psychosensomotor and memory capabilities. User test results are persistently stored in a database and available for further statistical analysis. The main research goal was suitability verification of different HCI techniques for special user groups. Case study was performed and numerical results verifying the proposed model are presented in the paper.

Keywords: HCI, User interface, Cognitive models, HCI testing tool, User profile.

Title of the Paper: PIRR: A Methodology for Distributed Network Management in Mobile Networks


Authors: Filippo Neri

Abstract: The current centralized Network Management approach in mobile networks may lead to scalability problems as new customer services and network nodes are deployed in the network. The next-generation mobile networks proposed by the 3rd Generation Partnership Project (3GPP) envision a flat network architecture comprising some thousands of network nodes, each of which will need to be managed in a timely and efficient fashion. This consideration has prompted research into alternative application architectures that could overcome the limits of the current centralized approaches. One approach to dealing with the increasing network size and complexity is to move from a centralized Network Management system to a more distributed or decentralized approach, where each Network Management application consists of a controller part and a set of distributed parts running on the individual network elements. This approach, however, raises the question of how to measure and estimate resource requirements when planning to distribute Network Management applications in mobile networks, before starting to produce commercial systems. In this paper, we describe the PIRR methodology we have developed to measure resource requirements for distributed applications in mobile networks, and the experimental findings of its application to new Network Management applications.

Keywords: Software Agents, JADE, Telecom network management system, distributed simulation systems, 3G cellular system.

Title of the Paper: A Hierarchical Relevance Feedback Algorithm for Improving the Precision of Virtual Tutoring Assistant Systems


Authors: Ji-Wei Wu, Judy C.R. Tseng

Abstract: In recent years, several virtual tutoring assistant systems based on question-answering systems have been proposed. These systems are very helpful for students to get instant help when teachers are not available. Pedagogic scholars consider that when a student is stuck on a certain problem while learning, instant tutoring assistance is very helpful in promoting his/her study. Therefore, the quality (precision) of the answers of a virtual tutoring assistant system is very important: it determines the effectiveness of the system. In practice, some students are not able to express their needs properly; some may not even know exactly what information they need. As a result, some problems do not match any existing solutions, even if they do contain some clues for finding solutions. This makes the effectiveness of virtual tutoring assistant systems unacceptable. In the literature, researchers have found that relevance feedback information is quite useful for improving the effectiveness of information retrieval systems. Among such methods, Rocchio’s Relevance Feedback (RRF) algorithm is the most well known and has been employed in several information retrieval systems. In this paper, we propose a novel pseudo relevance feedback algorithm, called the Hierarchical Relevance Feedback (HRF) algorithm, for improving the precision of virtual tutoring assistant systems. After a query is submitted by a student, the new virtual tutoring assistant system automatically modifies the student’s query according to the HRF algorithm and re-submits it to the system. Experimental results show that the effectiveness of the system is improved by automatically modifying the user’s query using the HRF algorithm. Moreover, the HRF algorithm also outperforms the well-known RRF algorithm.

Keywords: E-learning, Tutoring assistant, Information Retrieval, Question-answering, Relevance feedback
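The Rocchio baseline the abstract compares against has a standard form: the modified query is q' = α·q + β·centroid(relevant) − γ·centroid(non-relevant). A minimal sketch over dict-based term vectors (the default α, β, γ are common textbook values, not the paper's):

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio relevance feedback: q' = a*q + b*centroid(rel) - g*centroid(nonrel).

    Vectors are dicts of term -> weight; terms pushed negative are dropped.
    """
    def centroid(docs):
        c = {}
        for d in docs:
            for term, w in d.items():
                c[term] = c.get(term, 0.0) + w / len(docs)
        return c

    rel_c = centroid(relevant) if relevant else {}
    non_c = centroid(nonrelevant) if nonrelevant else {}
    new_q = {}
    for term in set(query) | set(rel_c) | set(non_c):
        w = (alpha * query.get(term, 0.0)
             + beta * rel_c.get(term, 0.0)
             - gamma * non_c.get(term, 0.0))
        if w > 0:
            new_q[term] = w
    return new_q
```

In a pseudo relevance feedback setting such as HRF, the "relevant" set is taken from the top-ranked results of the initial query rather than from explicit user judgments.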

Issue 4, Volume 5, April 2008

Title of the Paper: Analysis of the Potential for Nearly Circular Slope Failure Using On-site Survey Information with Adverse Calculation


Authors: Tienfuan Kerh, Y. M. Wang, H. C. Tsai

Abstract: Slope stability analysis is crucial for decreasing unwanted economic loss on mountain highways, which are prone to damage from heavy rains. The objective of this study is to perform an on-site investigation and adverse calculation of slope damage at a specified road section along the Tai-27 highway in Taiwan. By analysing a total of 38 on-site survey results, it is found that the types of slope damage include debris flow (26%), shallow soil layer sliding (26%), nearly circular sliding (8%), slope-oriented sliding (3%), falling rock (11%), river erosion (13%), and others (13%). Based on the available geological information at three locations with nearly circular slope failure, adverse calculations using computer software show that these slopes all failed at a shallow soil layer. Important soil strength parameters such as cohesion and friction angle are estimated, and a critical sliding surface is predicted for the minimum safety factor. The results provide useful information to help the relevant agency choose suitable engineering protection works at these hazardous locations.

Keywords: mountain highway, circular slope failure, adverse calculation, on-site survey, factor of safety, cohesion force, friction angle, critical sliding surface

Title of the Paper: Analysis on the Adaptive Scaffolding Learning Path and the Learning Performance of e-Learning


Authors: Chun-Hsiung Lee, Gwo-Guang Lee, Yungho Leu

Abstract: Although existing instruction websites record learners’ portfolios, they collect only browsing time and homepage information, without directly providing teachers with further data for analyzing learner behaviors. Consequently, this investigation uses the learners’ portfolios left in the e-learning environment and adopts data mining techniques to establish, for each cluster of learners, the most adaptive learning path pattern, which can provide a “scaffolding” to guide that cluster. Using statistical methods, this investigation analyzes whether the navigational learning map of the “scaffolding learning path (SLP)” can improve learning performance. It finds that, among the three clusters of learners, the experimental-group learners in the two clusters other than the high-score cluster achieve significantly more progress than the learners of the comparison group after taking the “scaffolding learning path” as their navigational learning map. This implies that the “scaffolding learning path” can improve the learning performance of most learners.

Keywords: E-learning, Learning path, Scaffolding theory, Learning portfolio, Data mining

Title of the Paper: Using Integer Programming to Solve the Crew Scheduling Problem in the Taipei Rapid Transit Corporation


Authors: Chui-Yen Chen

Abstract: Transport operation volume has reached millions of people since the Taipei Rapid Transit Corporation (TRTC) started operating on 28 March 1996, and TRTC has become the key public transportation provider in Taipei city. Crew scheduling of station employees has become one of the basic costs of operating the public rapid transit system. During the planning stage of the project, a mathematical model of the planning phases was established and used to solve the crew scheduling problem and reduce costs. In general, crew scheduling problems are related to the “set covering” or “set partitioning” problem; therefore, they are often formulated with 0-1 integer programming. Hence, this study adopts integer programming to establish a crew scheduling program. A target-based integer programming formulation of the crew shift scheduling problem is used to meet the interests of station employees. Furthermore, this study uses comparative analysis to examine (1) common shift scheduling; (2) employees’ preferred working hours; and (3) specified work days for employees, to help solve day-off and other problems. We hope to provide this method to the TRTC as a rapid and convenient way to construct timetables, improving operational efficiency and business competitiveness.

Keywords: Crew scheduling problem (CSP), 0-1 Integer programming, Taipei Rapid Transit Corporation (TRTC)
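The set-covering formulation mentioned above asks for the cheapest subset of candidate shifts whose union covers every required time slot. A toy exhaustive solver makes the structure visible (the shift names, slots and costs are invented; a real roster of TRTC scale needs an integer-programming solver, not enumeration):

```python
from itertools import combinations

def min_cost_cover(required_slots, shifts):
    """Exhaustive 0-1 set covering: cheapest subset of shifts whose union
    covers every required slot. `shifts` maps name -> (covered_slots, cost).
    """
    best = None
    names = list(shifts)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            covered = set()
            for name in combo:
                covered |= shifts[name][0]
            if required_slots <= covered:
                cost = sum(shifts[name][1] for name in combo)
                if best is None or cost < best[1]:
                    best = (set(combo), cost)
    return best
```

With slots {1..4}, an "early" shift covering {1, 2} at cost 3 plus a "late" shift covering {3, 4} at cost 3 beats a single "double" shift covering all four slots at cost 7.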

Title of the Paper: Malaysian Smart School Courseware Usability study: The Effectiveness of Analytical Evaluation Technique Compared to Empirical Study


Authors: Azizah Jaafar

Abstract: Usability evaluation methods are used to evaluate the interaction of the human with the computer for the purpose of identifying aspects of interaction that could be improved to increase usability. This research was carried out to determine the effectiveness of an analytical evaluation technique that was designed, developed and named the Jalan Rentasan Kognitif (JRK) evaluation technique. This evaluation was conducted on thirty-seven surrogate users, consisting of courseware developers, teachers and university students, using the Malaysian Smart School Mathematics courseware. The usability results were then compared statistically with an empirical usability evaluation called Task Analysis Exploratory through Observation using a video camera (TAEO). The respondents of the empirical method were secondary school students who used the same courseware as one of the learning aids in school. The usability indicators used in the study were usability problems and users’ satisfaction. The study showed that the effectiveness of the two techniques was quite similar when they were used independently during the evaluation process. The usability indicators obtained in the study could be used as references in usability comparisons of other educational courseware.

Keywords: Usability Evaluation Technique, Task Analysis, Surrogate user, Usability problems.

Title of the Paper: Analysis on Types of Mobile Games Played among the IHL Students in Malaysia


Authors: Hafizullah Amin Hashim, Siti Hafizah Ab Hamid, Mas Idayu Md Sabri

Abstract: Mobile phones are no longer a luxury but are now considered a necessity by their users. Many applications are developed to replace the electronic devices and extra ‘baggage’ that people used to carry around with them. A mobile phone now also serves as a planner, camera, music player, video player, gaming console, calculator and other useful gadgets. Mobile gaming is getting more attention from gamers as one of the most preferred types of digital games. One good reason for the success of mobile games is their mobility, which means gamers can play anytime and anywhere. In this paper, we examine the usage of mobile games as compared to other types of digital games, such as console or arcade games, in Malaysia. We also perform a series of analyses of student preferences in every gaming phase, from console or arcade games to mobile games. A survey of user preferences among students of Institutes of Higher Learning (IHL) was conducted to gather useful information and relevant data to support this paper. The survey showed that an overwhelming majority (60%) of the respondents prefer to play games on a mobile phone. The data gathered from the survey were analyzed using the Statistical Package for the Social Sciences (SPSS), Release 13.0.0, to investigate the number and percentage of students who use mobile games. The future direction of this study is discussed at the end of the paper.

Keywords: Mobile game, mobile learning, mobile phone, programming.

Title of the Paper: Automated Decision Support System Based on Ordered Sources


Authors: Sylvia Encheva, Sharil Tumin

Abstract: Making a decision based on comparing responses coming from different sources has always been difficult. In this paper we propose the application of a decision support system for selecting a shipyard based on the quality of services provided by a number of shipyards.

Keywords: Quality of services, selection, Web-based systems.

Title of the Paper: The Introduction of Supply Chain Management Concepts to e-Government Research and Practice


Authors: Ales Groznik, Peter Trkman

Abstract: The paper deals with various aspects of e-government and highlights the importance of the holistic treatment of business process renovation in order to facilitate the transition towards e-government. It analyses both upstream and downstream supply chain renovation and describes a four-step procedure for downstream renovation. A new definition of e-government that includes the whole supply chain of the public administration is proposed. The chief problems of the proposed approach are analysed, along with some interesting topics for further research. The findings are partly illustrated with a case study of the development of the Slovenian e-procurement portal.

Keywords: e-government, e-procurement, supply chain management, business renovation, case study.

Title of the Paper: E-logistics: Slovenian Transport Logistics Cluster creation


Authors: Ales Groznik

Abstract: In the pursuit of enhanced competitiveness, organisations today are searching for innovative business models in order to foster economic benefits. In Slovenia, several clusters are being formed, with the Slovenian Transport Logistics Cluster (STLC) among the most important. The STLC is currently in a stage of dynamic growth, demanding business model formation and adequate informatisation. The main goal of the paper is to present the informatisation of the STLC, bridging the gap between Supply Chain Management (SCM) and e-Logistics. The informatisation of the STLC is presented in several phases. The first phase involves business modelling of the existing business processes of organisations (AS-IS model). The results of the first phase allow an in-depth view of the STLC that is later used in the business model setup. Next, TO-BE processes are created, which are to be implemented and supported via informatisation. The result of the informatisation project is homogeneous and transparent business activity between cluster members. The purposes of the STLC’s informatisation are to create a business model, standardise business processes, allow cost cutting and improved business performance, reduce operating times, support asset management, and trace shipments, which are the basics of economic competitiveness.

Keywords: SCM, e-logistics, cluster, informatisation, business renovation.

Title of the Paper: Experiences of Implementing a Value-Based Approach


Authors: Pasi Ojala

Abstract: This study tries to advance and define the concepts, principles and practical methods of a value-based approach, involving the definition of the essential concepts of value, cost and worth in software development. These definitions originate from the Value Engineering (VE) method, originally applied and developed for the needs of the production industry; for the purposes of this study these concepts are therefore first justified and then defined. In order to study and evaluate the value-based approach, a method called value assessment is developed and used when assessing processes and products. The results of an industrial case show that even though there is still much to do in making the economics-driven view complete in software engineering, the value-based approach outlines a way towards a more comprehensive understanding of it. For industrial users, value assessment seems to provide practical help in handling cost- and profitability-related challenges.

Keywords: Software process and product improvement, assessment, value, worth, cost and Value Engineering.

Title of the Paper: A New Version of Flusser Moment Set for Pattern Feature Extraction


Authors: Constantin-Iulian Vizitiu, Doru Munteanu, Cristian Molder

Abstract: The choice of a suitable feature extraction method is essential for the success of the classification or recognition process. This paper proposes a design method for a new pattern descriptor set based on the Flusser moment class, which is invariant to elementary geometric transforms and has an increased robustness to the action of some perturbations. Experimental results based on the use of a real video image database confirm the basic properties of this new descriptor set.

Keywords: Pattern feature extraction, Flusser moment class, pattern recognition, neural network.
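The Flusser descriptors the abstract builds on are moment invariants computed from normalized central moments. As a minimal illustration of the shared machinery (the simplest rotation- and scale-invariant, η20 + η02, is shown; the paper's Flusser set builds on complex moments beyond this sketch):

```python
def phi1(image):
    """First scale- and rotation-invariant moment, eta20 + eta02, of a
    grayscale image given as a 2-D list of intensities.
    """
    # Raw moments: m_pq = sum over pixels of x^p * y^q * intensity.
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            m00 += v
            m10 += x * v
            m01 += y * v
    xc, yc = m10 / m00, m01 / m00  # intensity centroid
    # Second-order central moments about the centroid.
    mu20 = mu02 = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            mu20 += (x - xc) ** 2 * v
            mu02 += (y - yc) ** 2 * v
    # Scale normalisation: eta_pq = mu_pq / m00 ** ((p + q) / 2 + 1).
    return (mu20 + mu02) / m00 ** 2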

Title of the Paper: The Characteristics of Learning in Limited Data and the Comparative Assessment of Learning Methods


Authors: Fengming M. Chang

Abstract: Many studies of learning from limited data have been made in recent years. Without doubt, small data set learning is a challenging problem: the information in a small data set is scarce and imposes limits on learning. When discussing learning accuracy on limited data, different classification methods produce different results for different data sets, because each classification method has its own properties. A method that is the best solution for one data set is not the best for another. Therefore, this study analyzes the characteristics of small data set learning by comparing classification methods. The Mega-fuzzification method for small data set learning is mainly applied. A comparison of different classification methods for small data set learning with several kinds of data is also presented.

Keywords: Small data set, Mega-fuzzification, Machine learning, Classification method, Paucity of data.

Title of the Paper: The Effects of DTAI on Nursing Skills and Self-Efficacy


Authors: Mei-Huang Huang , Aih-Fung Chiu, Ju-Ling Liu

Abstract: This research evaluates the effects of Digital Technology Assisted Instruction (DTAI) applied in the Physical-Examining (PE) skill courses for nursing students. Both a "Learning Satisfaction with DTAI" survey and a "Self-efficacy" survey for these courses were designed as tools in this study. Third-year students (five-year program) in the nursing department of a technological college in Taiwan were recruited. One hundred and twenty-seven subjects who took the DTAI program on the internet were assigned to the experimental group, while the other 77 subjects, who took the traditional program, formed the control group. All data were analyzed with SPSS 13.0 for Windows. The results were: (1) The practice time after classes, scores on the skill test, and scores on the "Self-efficacy" survey of the experimental group were significantly higher than those of the control group. (2) The average time of online browsing through DTAI in the experimental group was 122.22 minutes. (3) The average score (5-point scale) on the "Learning Satisfaction with DTAI" survey was 4.37. (4) There were significantly positive correlations between practice time after classes, scores on the skill test and scores on "Self-efficacy" (p < .001). (5) DTAI for the PE skill courses made a good impression on nursing students. DTAI was indeed a convenient and efficient tool, without limitations of time or place, and produced a great improvement in learning effects. Moreover, the students gained more confidence in problem solving and nursing skill learning. Such an intervention can be incorporated into teaching to improve the nursing skills and self-efficacy of novices, so that the goals of teaching and learning can be achieved.

Keywords: Digital Technology Assisted Instruction (DTAI), Learning effects, Nursing students, Physical-Examining Course (PE), Self-efficacy.

Title of the Paper: A New Approach for the Creation of a Non-Profit Website with the Example of a Regional Museum


Authors: Pavel Makagonov, Celia B. Reyes Espinosa

Abstract: In the engineering of Internet applications the main demand is to concentrate on the interests of the user; that is, the user is predetermined. In this paper, a new approach is proposed for the case when the stakeholder is not a user and the final user is not sharply defined. This situation is typical of the design of a website for a non-profit rural regional museum. The first stages in this case consist of: the creation of a user model for the regional museum’s website; the study of the characteristics of existing successful regional museum sites; the creation of a quick prototype of the site taking into account the characteristic features of successful sites; and the use of the prototype for monitoring users’ behavior and organizing direct feedback with them. In this article the basic elements of these new first stages are considered and shown by example in the creation of a prototype website for future regional museum sites in the southeast region of Mexico.

Keywords: regional museum websites, user profiles, access monitoring, sample of successful websites.

Title of the Paper: Wavelets Families and Similarity Metrics Analysis in VIR System Design


Authors: L. Flores-Pulido, O. Starostenko, R. Contreras-Gomez, L. Alvarez-Ochoa

Abstract: This paper presents an analysis of some novel approaches to visual information retrieval (VIR) system design based on the extraction of wavelet coefficients and the application of specific similarity metrics. Four families of wavelets and three techniques for computing similarity between queried and retrieved images have been tested using the designed Image Retrieval by Neural Network and Wavelet Coefficients (RetNew) system. The best-performing Symlet transform and a similarity metric based on Euclidean distance have been adopted in a proposed VIR system called Image Retrieval by Wavelet Coefficients (IRWC). Additionally, in order to evaluate the proposed approach and the newly designed system, the recall and precision metrics used to analyze the performance of VIR facilities have been applied on the basis of the standard COIL-100 image collection. The obtained results show an increase in retrieval efficiency of up to 93% without additional processing time; the proposed approach may therefore be considered a good alternative for designing new VIR systems, and the results facilitate the development of new methods for the still open problem of efficient image retrieval.

Keywords: Image Processing, Visual Information Retrieval, Similarity Metrics, Wavelets
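The wavelet-plus-Euclidean pipeline can be sketched in one dimension. The paper adopts Symlets on 2-D images; a Haar transform is shown here only because it fits in a few lines, and being orthonormal it preserves Euclidean distances between signals (Parseval), which is what makes distance in coefficient space a meaningful similarity measure:

```python
import math

def haar_1d(signal):
    """Full 1-D orthonormal Haar decomposition (length a power of two)."""
    out = list(signal)
    n = len(out)
    while n > 1:
        half = n // 2
        tmp = out[:]
        for i in range(half):
            tmp[i] = (out[2 * i] + out[2 * i + 1]) / math.sqrt(2)      # averages
            tmp[half + i] = (out[2 * i] - out[2 * i + 1]) / math.sqrt(2)  # details
        out[:n] = tmp[:n]
        n = half
    return out

def euclidean(a, b):
    """Euclidean distance between two coefficient vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

In a retrieval system, only the coarsest coefficients would typically be kept as the feature vector, trading a little of this exactness for a much smaller index.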

Title of the Paper: Fuzzy Model for Estimation of Passenger Car Unit


Authors: Praveen Aggarwal

Abstract: In most developing countries, including India, mixed traffic conditions prevail on roads and highways, and there is wide variation in the static and dynamic characteristics of different types of traffic. The only way to account for this non-uniformity in any analysis of a traffic stream is to convert all vehicles into a common unit, and the most accepted unit for this purpose is the passenger car unit (PCU). The PCU value of a vehicle is not constant but varies with the surrounding traffic and roadway conditions, and a number of factors affecting PCU values have been identified. The current study aims at developing a fuzzy model for estimating PCU values for buses. A fuzzy model is appropriate because of the number of independent affecting factors. Results of the developed MATLAB-based fuzzy model are compared with quoted results and show a high degree of correlation.

Keywords: Passenger Car Unit, fuzzy model.
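The shape of such a fuzzy estimator can be sketched with triangular membership functions and weighted-average defuzzification. The input ranges, rules and output PCU values below are invented for illustration; the paper's model uses its own calibrated factors and MATLAB's fuzzy toolbox:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_pcu(traffic_volume, lane_width):
    """Toy Mamdani-style inference with two illustrative rules and
    weighted-average defuzzification. All numbers are made up.
    """
    # Rule 1: low volume AND wide lane -> low bus PCU (say 3.0).
    low_vol = tri(traffic_volume, 0, 500, 1500)
    wide = tri(lane_width, 3.0, 3.75, 4.5)
    w1, pcu1 = min(low_vol, wide), 3.0
    # Rule 2: high volume AND narrow lane -> high bus PCU (say 5.0).
    high_vol = tri(traffic_volume, 500, 1500, 3000)
    narrow = tri(lane_width, 2.0, 3.0, 3.75)
    w2, pcu2 = min(high_vol, narrow), 5.0
    total = w1 + w2
    return (w1 * pcu1 + w2 * pcu2) / total if total else (pcu1 + pcu2) / 2
```

Inputs between the two prototypical conditions yield PCU values that interpolate smoothly between the rule outputs, which is the point of using fuzzy logic here.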

Title of the Paper: Using Data Mining to Provide Recommendation Service


Authors: Ruey-Shun Chen, Yung-Shun Tsai, K.C. Yeh, D.H. Yu, Yip Bak-Sau

Abstract: This research introduces a personalized recommendation service into library services. Using the borrowing records of the library as a basis, the association rules of data mining are used to find book associations by focusing on readers’ borrowing modes, personal interests and traits, in order to simplify the complexity of the recommendation structure. The Bayesian network concept is used to build a personalized book recommender system that generates book recommendations ranked from high to low, helping readers locate the book information most suitable to their requirements. Meanwhile, a user satisfaction questionnaire is used to assess the accuracy of the recommended books, and this information is fed back to support the subsequent learning of the Bayesian network parameters. This perfects the overall structure of the recommender system, so that readers can make use of library resources more effectively and the value of the library system can be further improved.

Keywords: Recommendation System, Bayesian Network, Data Mining.
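Association rule mining over borrowing records reduces to counting co-occurrences: support(A, B) is the fraction of records containing both titles, and confidence(A → B) = support(A, B) / support(A). A minimal pairwise sketch (real systems use Apriori or FP-growth to scale beyond pairs):

```python
from itertools import combinations

def association_rules(transactions, min_support=0.5, min_confidence=0.6):
    """Mine pairwise rules A -> B from borrowing records (sets of titles)."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
        for pair in combinations(sorted(t), 2):
            key = frozenset(pair)
            counts[key] = counts.get(key, 0) + 1
    rules = []
    for itemset, c in counts.items():
        if len(itemset) != 2 or c / n < min_support:
            continue
        a, b = tuple(itemset)
        for lhs, rhs in ((a, b), (b, a)):
            conf = c / counts[frozenset([lhs])]
            if conf >= min_confidence:
                rules.append((lhs, rhs, c / n, conf))
    return rules
```

For example, if two of three borrowers who took "java" also took "sql", the rule sql → java has confidence 1.0 while java → sql has confidence 2/3.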

Title of the Paper: Optimal Run Time for EMQ Model with Backordering, Failure-In- Rework and Breakdown Happening in Stock-Piling Time


Authors: Yuan-Shyi Peter Chiu, Shun-Sheng Wang, Chia-Kuan Ting, Hsien-Ju Chuang, Yu-Lung Lien

Abstract: This study examines the optimal run time for the economic manufacturing quantity (EMQ) model with failure-in-rework, backlogging, and random breakdowns happening in stock-piling time. A recent article by Chiu and Chiu [Mathematical modeling for production system with backlogging and failure in repair, Journal of Scientific & Industrial Research, 65 (2006) 499-506] investigated the optimal lot size for an EMQ model with backordering and failure-in-rework. By incorporating random machine breakdown, another inevitable reliability factor, into their model, this research examines its effects on the optimal run time and on the long-run average costs. Mathematical modeling and cost analysis are employed, and the renewal reward theorem is used to cope with the variable cycle length. Convexity of the expected production-inventory cost function is proved. An optimal replenishment policy that minimizes overall costs is derived for such an unreliable system. A numerical example is provided to show its practical usage. Managers in the field can adopt this run-time decision to establish their own robust production plans accordingly.

Keywords: Manufacturing, Run time, Breakdown, Failure-in-rework, Backorder.
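The model above extends the basic EMQ, whose core trade-off is worth seeing in code: setup cost falls and holding cost rises with lot size Q, and the optimum balances the two. This sketch shows only that textbook core (no backordering, rework failure or breakdowns, which are the paper's contribution); parameter names are generic:

```python
import math

def emq_cost(q, setup, demand, prod_rate, hold):
    """Long-run average cost per unit time for lot size q in the basic EMQ
    model: setup cost + holding cost with a finite production rate.
    """
    return setup * demand / q + hold * q * (1 - demand / prod_rate) / 2

def optimal_lot(setup, demand, prod_rate, hold):
    """Closed-form optimum: Q* = sqrt(2*K*d / (h * (1 - d/P)))."""
    return math.sqrt(2 * setup * demand / (hold * (1 - demand / prod_rate)))
```

Because the cost function is convex in q, the closed-form Q* beats any nearby lot size; the paper proves the analogous convexity for its richer expected-cost function before deriving the optimal run time.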

Title of the Paper: An Optimized Pseudorandom Generator using Packed Matrices


Authors: Jose-Vicente Aguirre, Rafael Alvarez, Leandro Tortosa, Antonio Zamora

Abstract: Most cryptographic services and information security protocols require a dependable source of random data; pseudorandom generators are convenient and efficient for this application working as one of the basic foundation blocks on which to build the required security infrastructure. We propose a modification of a previously published matricial pseudorandom generator that significantly improves performance and security by using word packed matrices and modifying key scheduling and bit extraction schemes. The resulting generator is then successfully compared to world class standards.

Keywords: Pseudorandom Generator, Stream Ciphers, Binary Matrices, Cryptography, Security.
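The word-packing trick the abstract relies on stores an n x n binary matrix as n machine words, one row per integer, so a GF(2) matrix-vector product becomes an AND plus a parity fold per row. This toy generator illustrates only that packing idea, not the published generator, its key schedule, or its bit-extraction scheme:

```python
def parity(x):
    """Parity (XOR-fold) of the set bits of x."""
    p = 0
    while x:
        p ^= 1
        x &= x - 1  # clear the lowest set bit
    return p

class ToyMatrixGenerator:
    """State update s <- M*s over GF(2), with M stored row-per-word."""

    def __init__(self, rows, state):
        self.rows, self.state = rows, state

    def next_bit(self):
        new_state = 0
        for i, row in enumerate(self.rows):
            # Bit i of M*s is the parity of (row_i AND s).
            new_state |= parity(row & self.state) << i
        self.state = new_state
        return self.state & 1
```

With the 2x2 matrix rows [0b10, 0b11] and initial state 0b01, the state cycles 01 -> 10 -> 11 -> 01, emitting the period-3 bit stream 0, 1, 1, ...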

Title of the Paper: Understanding Participant Loyalty Intentions in Virtual Communities


Authors: Echo Huang, Meng-Hsiang Hsu, Yu-Ren Yen

Abstract: This study proposes a conceptual model based on expectation confirmation theory with extended antecedent variables (human assets, technical assets, and complementary assets) to examine the impact of virtual communities’ assets on member satisfaction and perceived usefulness. The members of JavaWorld@TW, a representative professional community in Taiwan, were chosen to participate in the survey, and 235 usable responses were collected over three months. Partial Least Squares (PLS) regression was used to test the model; the findings show that satisfaction is the strongest predictor of revisiting, followed by perceived usefulness. Furthermore, higher confirmation of human assets, technical assets, and complementary assets is accompanied by higher satisfaction and perceived usefulness, which influence the intention to revisit. We also explored differences within groups: weak-tie members and strong-tie members were separated for further examination. The differences between the behavioral models of weak-tie and strong-tie members, particularly in the revisit context, shed light on the importance of developing related theories that can be applied to shape the post-use behavior of specific groups. Implications are proposed in the final section.

Keywords: Resource-Based Theory, Expectation Confirmation Theory, Revisit, Virtual Communities, human assets, technical assets, complementary assets.

Title of the Paper: Characterization of Imaging Phone Cameras Using Minimum Description Length Principle


Authors: Adrian Burian, Aki Happonen, Mihaela Cirlugea

Abstract: In this paper, a new Minimum Description Length (MDL) approach for the characterization of a mobile phone’s color camera is presented. The use of high-order polynomials, Fourier sine series, and artificial neural networks (ANN) for solving this problem is compared and contrasted. The MDL formalism is used to determine the stochastic complexity of polynomial and Fourier sine models for the characterization of a Nokia N93 mobile phone camera. A quantitative evaluation of their performance, as well as of the ANN approach, is provided.

Keywords: Minimum Description Length, High-Order Polynomial, Artificial Neural Network, Imaging Mobile Phone.

Title of the Paper: The Internet Visit Rate, its Monitoring and Analysis


Authors: Jiri Kohout, Antonin Slaby

Abstract: Currently, the Internet visit rate is a topic of increasing importance. It is a tool for monitoring the success of marketing campaigns and for comparing one's popularity with that of competitors. This article analyzes the process of measuring the frequency distribution of online customers' behaviour, highlights the risks in interpreting the results, and presents methods used in the data analysis process. Part one outlines a way of collecting data for the analysis by recording them in the web log. Another method, working on the active-content principle, is then presented. A comparison of both methods follows, supported by an overview of their advantages and disadvantages. Finally, the findings are summarized and result in a list of essential preconditions for an objective and comprehensive analysis of visit rate data.

Keywords: Internet visit rate, Monitoring, Web Server Logs.

Title of the Paper: Real Time Trajectory Based Hand Gesture Recognition


Authors: Daniel Popa, Georgiana Simion, Vasile Gui, Marius Otesteanu

Abstract: The recognition of hand gestures from image sequences is an important and challenging problem. This paper presents a robust solution to track and recognize a list of hand gestures from their trajectories. The main tools of the proposed solution are robust kernel density estimation and the related mean shift algorithm, used in both video tracking and trajectory segmentation. The gesture definition is based on strokes in order to allow the use of a low-complexity gesture recognition method. The gesture recognition process is trivial, being reduced to a syntactic analysis of the feature vector, avoiding the need for complex classification methods based on curve matching. Despite the restrictions derived from the stroke-based definition of gestures, the low computational complexity of the algorithm allows its implementation on low-cost processing systems.
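
As a generic illustration of the mean shift procedure the abstract relies on (not the authors' implementation), the following sketch moves a starting point toward the nearest mode of a Gaussian kernel density estimate; the two-cluster data and the bandwidth are made up for the example:

```python
import numpy as np

def mean_shift(points, start, bandwidth=1.0, iters=100, tol=1e-6):
    """Move `start` toward the nearest mode of the kernel density
    estimate of `points`, using a Gaussian kernel."""
    x = np.asarray(start, dtype=float)
    for _ in range(iters):
        # Gaussian kernel weights of all samples relative to x
        d2 = np.sum((points - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        x_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:  # converged to a mode
            return x_new
        x = x_new
    return x

# two well-separated clusters; starting near one of them,
# the procedure converges to that cluster's mode
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (50, 2)),
                 rng.normal(5, 0.3, (50, 2))])
mode = mean_shift(pts, start=[0.8, 0.8], bandwidth=1.0)
```

In a tracking setting the same iteration is run on each frame, seeded with the previous frame's position, which is what makes the method cheap enough for real time.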

Keywords: gesture recognition, mean shift, robust methods, video tracking, human machine interface.

Title of the Paper: MeDiMed - Regional Center for Medicine Multimedia Data Exchange


Authors: Karel Slavicek, Michal Javornik, Otto Dostal

Abstract: The Institute of Computer Science of Masaryk University has been working in the field of medical multimedia data transport, archiving, and processing for more than ten years. Since the first steps, such as the transport of ultrasound and CT images across a private fiber-optic network, these activities have grown into a regional PACS archive. This paper describes the technology background of the MeDiMed project.

Keywords: PACS, DICOM, medicine multimedia data.

Title of the Paper: Analysis of Retailer Web Sites with Microeconomic Interpretation


Authors: Tomislav Herceg, Bozidar Jakovic, Milivoj Markovic

Abstract: The competition in the Croatian retailing business is at its peak, and it is unclear when it will settle. This situation opens up new market opportunities, and finding new niches may be of crucial importance. Leaders and followers have switched places, as have CEOs and marketing policies. Although price competition is still very important, its detrimental effect on profits pushes retailers to differentiate their products by several means, one of which may be e-retailing. Even though e-retailing accounts for a certain share of sales worldwide, it has yet to take hold in Croatia. The spread of broadband connections and the entry of new companies into the communications market have decreased the price and increased the accessibility of the Internet. It is now up to retailing companies to recognize this opportunity and occupy this newly created area of business. A comparison between the 18 largest retailers in Croatia and in the world, using a previously devised Web metric and methods, led to the following conclusions: only one company offers the possibility of e-purchase, and many have only basic information on their web sites. However, there is some connection between the size of a company and the development level of its web site. Companies should invest in their web sites and develop an e-sales point, including online shops, web site maintenance, web design, and user-friendliness, as well as in the measurement of e-activities. Multimedia content should not be forgotten, since it is what makes a site recognizable.

Keywords: Web sites, Retailers, Web metrics, Croatian retail market, Market congestion, Competitive market, Retailing market analysis, E-purchase, Web shop.

Title of the Paper: E-learning and its Application to Microeconomics


Authors: Bozidar Jakovic, Tomislav Herceg, Fran Galetic

Abstract: Distance learning is an increasingly important component of education. Its low cost and easily accessible materials can play an important role in raising the overall education level in the world, since Internet access is growing exponentially. At the same time, it allows better mobility of labour because it lets students live much further from the university they attend. It also decreases living costs, as well as the costs of the educational institution itself. Microeconomics is a discipline easily adaptable to e-learning because its graphical representations and complicated deductions can be taught using interactive presentations and simple software. Several authors offer multiple possibilities for implementing e-learning systems. We analyzed some cases in which distance learning offers even greater possibilities than traditional teaching techniques, using illustrative graphs and thorough step-by-step analytic deductions of core microeconomics, connecting theory and practice through statistical and econometric programmes and methods. The final aim of this paper is to introduce a more efficient, more comprehensive, and cheaper teaching method.

Keywords: Microeconomics, 3D graphs, e-learning, education, distance learning, multimedia materials, interactive presentation, teaching techniques, Web 2.0.

Issue 5, Volume 5, May 2008

Title of the Paper: Error Measurements and Parameters Choice in the GNG3D Model for Mesh Simplification


Authors: Rafael Alvarez, Leandro Tortosa, Jose F. Vicent, Antonio Zamora

Abstract: In this paper we present different error measurements with the aim of evaluating the quality of the approximations generated by the GNG3D model for mesh simplification. The first phase of this method consists of the execution of the GNG3D algorithm, described in the paper. The primary goal of this phase is to obtain a simplified set of vertices representing the best approximation of the original 3D object. In the reconstruction phase we use the information provided by the optimization algorithm to reconstruct the faces, thus obtaining the optimized mesh. The implementation of three error functions, named Eavg, Emax, and Esur, allows us to control the error of the simplified model, as shown in the examples studied. Moreover, based on the error measurements implemented in the GNG3D model, a procedure is established to determine the best values for the different parameters involved in the optimization algorithm. Some examples are shown in the experimental results.
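
Error measures of the kind named in the abstract can be illustrated generically. The sketch below is not the paper's exact definitions of Eavg, Emax, or Esur; it only shows the usual idea of averaging and maximizing the distance from each original vertex to the simplified vertex set:

```python
import numpy as np

def simplification_errors(original, simplified):
    """For each original vertex, compute the distance to the nearest
    simplified vertex; return the average and the maximum of those
    distances (a generic stand-in for Eavg / Emax style measures)."""
    original = np.asarray(original, dtype=float)
    simplified = np.asarray(simplified, dtype=float)
    # pairwise distance matrix of shape (n_original, n_simplified)
    d = np.linalg.norm(original[:, None, :] - simplified[None, :, :], axis=2)
    nearest = d.min(axis=1)          # distance to closest kept vertex
    return nearest.mean(), nearest.max()

# four collinear vertices simplified down to the two endpoints
orig_vertices = [[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]]
simp_vertices = [[0, 0, 0], [3, 0, 0]]
e_avg, e_max = simplification_errors(orig_vertices, simp_vertices)
```

A surface-based measure such as Esur would instead integrate point-to-surface distances over the reconstructed faces, which requires the mesh connectivity as well as the vertices.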

Keywords: Surface simplification, mesh reconstruction, error approximations, neural networks, growing neural gas, growing cell structures.

Title of the Paper: Enhancing Software Projects Course Work by Advanced Management


Authors: Jyhjong Lin

Abstract: Software projects are an important course in software engineering curricula. They provide students with an opportunity to gain valuable experience in applying the theoretical material learned in software engineering courses. However, many drawbacks in current project course work make these benefits difficult to realize. In this paper, we discuss these drawbacks and how they affect the effectiveness of this course. To address these difficulties, we advocate a model that enhances the management approach to provide effective administration for each project. Our experiments with the model have shown significant improvements in the quality of the projects and in the learning experience of the students.

Keywords: software engineering education, software project, management, model.

Title of the Paper: Algorithm MONSA for All Closed Sets Finding: Basic Concepts and New Pruning Techniques


Authors: Rein Kuusik, Grete Lind

Abstract: In this paper an algorithm named MONSA for closed sets mining is presented. It does not use the same kinds of techniques as ChARM by Zaki and Hsiao. MONSA is an exact depth-first search algorithm that extracts only frequent closed sets, using several new and very effective pruning techniques to avoid repetitive and empty patterns. MONSA does not depend on the initial order of objects. In MONSA only one branch, the one under construction, is active at any time. The purpose of this paper is to describe the approach used in MONSA and the correspondence of its basics and concepts to the approach of Zaki and Hsiao. A full worked example of the algorithm is presented. The algorithm simultaneously finds the intersections (closed sets) and IF…THEN rules on subsets of the source data set. MONSA handles not only binary data, but also a larger set of discrete values.
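
For readers unfamiliar with the problem, a depth-first enumeration of frequent closed sets can be sketched as follows. This is a generic illustration of the task (a set is closed when it equals the intersection of all transactions containing it), not the MONSA algorithm or its pruning techniques:

```python
def closed_itemsets(transactions, min_support):
    """Enumerate frequent closed itemsets depth-first.
    `transactions` is a list of sets; a closed set equals the
    intersection of all transactions that contain it."""
    items = sorted({i for t in transactions for i in t})
    results = []

    def closure(tids):
        common = set(items)
        for t in tids:
            common &= transactions[t]
        return common

    def search(current, tids, start_idx):
        for idx in range(start_idx, len(items)):
            item = items[idx]
            if item in current:
                continue
            new_tids = [t for t in tids if item in transactions[t]]
            if len(new_tids) < min_support:
                continue          # not frequent: prune this branch
            clo = closure(new_tids)
            # duplicate check: if the closure contains an item that
            # precedes `item` and is not already in `current`, this
            # closed set is reached from another branch
            if any(i < item and i not in current for i in clo):
                continue
            results.append((frozenset(clo), len(new_tids)))
            search(clo, new_tids, idx + 1)

    search(set(), list(range(len(transactions))), 0)
    return results

transactions = [{'a', 'b', 'c'}, {'a', 'b'}, {'a', 'c'}, {'b', 'c'}]
closed = closed_itemsets(transactions, min_support=2)
```

The duplicate check is what keeps each closed set to a single branch, which is the general property the abstract attributes to MONSA's pruning.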

Keywords: Data mining, Frequent closed sets, Pruning techniques, Depth-first search, Monotone systems.

Title of the Paper: Business Intelligence Systems: A Comparative Analysis


Authors: Carlo Dell’aquila, Francesco Di Tria, Ezio Lefons, Filippo Tangorra

Abstract: A set of evaluation criteria is described and considered for comparing some popular OLAP systems that support Business Intelligence. These criteria involve critical aspects such as information delivery, system and user administration, and OLAP queries. The measurement method is based on functional complexity analysis. Experimental results have been obtained using a data warehouse in an academic environment; they highlight the weaknesses and the strengths of each compared system.

Keywords: Data warehouse, Data mart, OLAP system, Functional size measurement, Business Intelligence platform.

Title of the Paper: Feasibility of Implementing B2B e-Commerce in Small and Medium Enterprises


Authors: Tien-Chin Wang, Ying-Ling Lin

Abstract: E-commerce has substantially affected the business world in recent years, and its importance is expected to continue increasing in the future. Since implementing B2B e-commerce in small and medium enterprises (SMEs) is a long-term commitment, and such enterprises are more limited in resources than large enterprises, the predicted value of successful implementation is extremely useful in deciding whether to initiate B2B e-commerce. This investigation establishes an analytical hierarchy framework to help SMEs predict implementation success and identify the actions necessary before implementing B2B e-commerce to increase the feasibility of the e-commerce initiative. The consistent fuzzy preference relation is used to improve the consistency and effectiveness of decision making. A case study involving six influences solicited from a Taiwanese steel company is used to illustrate the feasibility and effectiveness of the proposed approach.
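
The general idea behind a consistent fuzzy preference relation can be sketched numerically: from only the n−1 comparisons between adjacent alternatives, additive transitivity (p_ik = p_ij + p_jk − 0.5) determines the entire relation, which is what removes inconsistency from the decision maker's judgments. This is an illustrative sketch of the technique in general, not the paper's procedure or data:

```python
def cfpr_matrix(adjacent):
    """Complete an n-by-n fuzzy preference relation from the n-1
    adjacent values p[i][i+1], using additive transitivity:
        p[i][k] = p[i][j] + p[j][k] - 0.5
    Diagonal entries are 0.5 (indifference) and p[j][i] = 1 - p[i][j]."""
    n = len(adjacent) + 1
    p = [[0.5] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            # chain the adjacent comparisons from i to j
            val = sum(adjacent[i:j]) - (j - i - 1) / 2.0
            p[i][j] = val
            p[j][i] = 1.0 - val
    return p

# three alternatives: A slightly preferred to B (0.6), B to C (0.7);
# A's preference over C then follows by transitivity
p = cfpr_matrix([0.6, 0.7])
```

Note that strongly skewed inputs can push derived entries outside [0, 1], in which case the usual practice is to rescale the whole matrix back into the unit interval.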

Keywords: B2B e-commerce; small and medium enterprises; multi-criteria decision making; consistent fuzzy preference relation.

Title of the Paper: Improving Query Performance in Virtual Data Warehouses


Authors: Adela Bâra, Ion Lungu, Manole Velicanu, Vlad Diaconiţa, Iuliana Botha

Abstract: In order to improve the quality of Business Intelligence systems in an organization, we can build the system using BI techniques such as OLAP and data warehousing, or using traditional reports based on SQL queries. The cost and development time of BI tools are greater than those of SQL reports, and these factors are important when deciding which techniques to use. Likewise, low performance in data extraction from the data warehouse can be critical because of its major impact on how the data are used: if a BI report takes a long time to run, or the data displayed are no longer current enough for critical decisions, the project can be compromised. In such cases, several techniques can be applied to reduce query execution time and improve the performance of BI analyses and reports. In this paper we present an overview of the implementation of a Business Intelligence project in a national company, the problems we confronted, and the techniques we applied to reduce the execution cost and improve query performance in this decision support system.

Keywords: Tuning and optimization, SQL query plans, Business Intelligence projects, Virtual data warehouse, Data extraction, Query optimization and performance, Partitioning techniques, Indexes, Analytical functions.

Title of the Paper: Automatically Extracting Important Sentences from Story based on Connection Patterns of Propositions in Propositional Network


Authors: Hideji Enokizu, Satoshi Murakami, Moriaki Kumasaka, Kazuhiro Uenosono, Seiichi Komiya

Abstract: In recent years, the world has been filled with a large amount of information through the Internet and other channels. Such a situation increasingly raises the value of automatic text summarization, which supports a quick grasp of text content. Automatic text summarization has so far been accomplished by extracting important sentences from a text based on various surface cues. In contrast, we devised a new method to extract important sentences from a story according to the way people comprehend it. In designing this new method, we took account of the text comprehension model, that is, how people comprehend story text. We then devised a procedure for transforming a set of propositions into a propositional network. In Experiment 1, participants were asked to select the sentences they regarded as important from five stories. We then examined how the propositions drawn from each important sentence were connected in the propositional network of each story. As a result, we identified three distinctive connection patterns. In Experiment 2, we examined whether those connection patterns are valid as rules for extracting important sentences from five new stories. From the sentences extracted by our system and the important sentences selected by the participants, we calculated aggregate accuracy measures. We found that they were clearly higher than the baselines; moreover, they were equal to or higher than those obtained in previous research. This finding was replicated with the stories used in Experiment 1.

Keywords: Automatic Text Summarization, Extracting Important Sentences, Extended Rules of Connecting Propositions, Propositional list, Propositional network, Connection Patterns.

Title of the Paper: Performance Evaluation of the Software Visualization tools and A New Framework to Manage Cognitive Load in Computer Program Learning


Authors: Muhammed Yousoof, Mohd Sapiyan, K. Ramasamy

Abstract: The cognitive load experienced while learning programming is very high due to the high element of interactivity and poor instructional design. Prior researchers [2][5][6][10] have focused on minimizing this load through approaches such as program visualization and pair programming. Many computer-based tools on the market aim to reduce the cognitive load experienced by learners and to facilitate faster learning. In this paper, two such tools, Jeliot and BlueJ, are considered. We evaluate the effectiveness of each tool in easing the learning process. An experiment was conducted among students of Computer Science at Dhofar University to evaluate the effectiveness of each tool in reducing the load experienced by the learner. The impact of these measures is not directly determinable, since there is no mechanism to monitor the load, and thus the results of previous studies are very subjective. We therefore also propose a three-layer framework that helps manage the load by monitoring it: when the load exceeds capacity, the instructional design can be altered or customized to enable learning. The proposed framework is a novel way to ease the learning process of computer programming.

Keywords: Cognitive Load, Jeliot, BlueJ, Physiological Measures, Galvanic Skin Response.

Title of the Paper: Use of Wavelet-based Basis Functions to Extract Rotation Invariant Features for Automatic Image Recognition


Authors: Santiago Akle, Maria-Elena Algorri, Ante Salcedo

Abstract: In this paper we explore the use of orthogonal functions as generators of representative, compact descriptors of image content. In Image Analysis and Pattern Recognition such descriptors are referred to as image features, and there are some useful properties they should possess such as rotation invariance and the capacity to identify different instances of one class of images. We exemplify our algorithmic methodology using the family of Daubechies wavelets, since they form an orthogonal function set. We benchmark the quality of the image features generated by doing a comparative OCR experiment with three different sets of image features. Our algorithm can use a wide variety of orthogonal functions to generate rotation invariant features, thus providing the flexibility to identify sets of image features that are best suited for the recognition of different classes of images.
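
A generic way to see why orthogonal transforms can yield rotation-invariant features (illustrative only; the paper's Daubechies-wavelet construction is not reproduced here) is to sample the image on a ring around its center: a rotation circularly shifts the ring samples, and the magnitudes of the samples' Fourier coefficients are invariant to circular shifts.

```python
import numpy as np

def ring_samples(image, radius, n_angles=64):
    """Sample pixel values on a circle around the image center
    (nearest-neighbour interpolation)."""
    cy = (image.shape[0] - 1) / 2.0
    cx = (image.shape[1] - 1) / 2.0
    theta = 2 * np.pi * np.arange(n_angles) / n_angles
    ys = np.rint(cy + radius * np.sin(theta)).astype(int)
    xs = np.rint(cx + radius * np.cos(theta)).astype(int)
    return image[ys, xs]

def rotation_invariant_feature(samples):
    """Rotating the image circularly shifts the ring samples;
    FFT magnitudes are invariant to such shifts."""
    return np.abs(np.fft.fft(samples))

img = np.arange(33 * 33, dtype=float).reshape(33, 33)
s = ring_samples(img, radius=10)
f = rotation_invariant_feature(s)
```

Replacing the Fourier basis with another orthogonal family, as the abstract proposes, changes the descriptor but preserves this shift-magnitude reasoning.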

Keywords: Rotation Invariant Features, Zernike Moments, Haar Wavelets, Daubechies Wavelets, Orthogonal Functions, OCR.

Title of the Paper: A Protocol For Self-Organizing Peer-to-Peer Network Supporting Content-Based Search


Authors: Igor Mekterovic, Mirta Baranovic, Kresimir Krizanovic

Abstract: For a peer-to-peer (P2P) content sharing network holding a large amount of data, an efficient semantics-based search mechanism is a key requisite. Semantic search should generate as little traffic (messages) as possible while achieving precision and recall rates comparable to those of a corresponding centralized system. In this paper, protocols for self-organizing P2P networks that arrange links between peers according to the peers' content are developed and tested. Peers organize themselves into "semantic communities" without losing links to other semantic communities. The proposed network requires no prior knowledge of the semantics of the documents to be shared in the system. Through simulations, it is shown that the proposed network is resilient to membership changes and achieves high recall rates.

Keywords: Peer-to-peer, Content-based search, Information retrieval, Algorithm, Semantic.

Title of the Paper: Visual Reinforcement Learning Algorithm using Self Organizing Maps and Its Simulation in OpenGL Environment


Authors: Hiroshi Dozono Ryouhei Fujiwara, Takeshi Takahashi

Abstract: Recently, camera systems have become more widely available for mobile robots, but scene analysis for generating control signals is still difficult and consumes substantial computational power. For this problem, a control method that generates the control signals directly from raw camera images can be effective. In this paper, we use reinforcement learning with the camera image as input data. To divide the state space represented by camera images, a self-organizing map is introduced. The division of the states and the learning of the control signals by reinforcement learning are executed simultaneously on the map. To examine the performance of this algorithm, we built a simulation system with a graphical user interface using OpenGL.
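
The combination described, a self-organizing map quantizing raw observations into discrete states while Q-learning runs over the map's nodes, can be sketched generically. The dimensions, learning rates, and reward here are made up for illustration; this is not the authors' system:

```python
import numpy as np

rng = np.random.default_rng(0)

class SomQLearner:
    """A 1-D SOM quantizes observation vectors into discrete states;
    a Q-table indexed by the winning node is updated by Q-learning."""

    def __init__(self, n_nodes, obs_dim, n_actions,
                 som_lr=0.3, alpha=0.1, gamma=0.9):
        self.w = rng.random((n_nodes, obs_dim))   # SOM codebook vectors
        self.q = np.zeros((n_nodes, n_actions))   # Q-values per node
        self.som_lr, self.alpha, self.gamma = som_lr, alpha, gamma

    def state(self, obs):
        """Best-matching unit = discrete state for this observation."""
        return int(np.argmin(np.linalg.norm(self.w - obs, axis=1)))

    def observe(self, obs):
        """Move the winner and its neighbours toward the observation,
        so state division and control learning proceed together."""
        bmu = self.state(obs)
        for j in range(len(self.w)):
            h = np.exp(-abs(j - bmu))             # neighbourhood factor
            self.w[j] += self.som_lr * h * (obs - self.w[j])
        return bmu

    def learn(self, s, action, reward, s_next):
        """Standard one-step Q-learning update on the map's Q-table."""
        target = reward + self.gamma * self.q[s_next].max()
        self.q[s, action] += self.alpha * (target - self.q[s, action])

agent = SomQLearner(n_nodes=8, obs_dim=16, n_actions=3)
obs = rng.random(16)             # stand-in for a flattened camera image
s = agent.observe(obs)           # quantize the raw observation
agent.learn(s, action=1, reward=1.0, s_next=s)
```

In a real robot the observation vector would be a downsampled camera frame and the actions would be motor commands; the structure of the update loop stays the same.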

Keywords: Reinforcement Learning, Self Organizing Map, Learning algorithm, Mobile robot, OpenGL.

Title of the Paper: Numerical Experiments on Pareto-optimal Task Assignment Representations by Tabu-based Evolutionary Algorithm


Authors: Jerzy Balicki

Abstract: Meta-heuristics like evolutionary algorithms require extensive numerical experiments to adjust their capabilities for solving decision-making problems. Evolutionary algorithms can be applied to finding solutions in distributed computer systems. Reliability and load balancing are crucial factors in the quality evaluation of distributed systems. Load balancing of web servers can be implemented by reducing the workload of the bottleneck computer, which improves both the performance of the system and the safety of the bottleneck computers. An evolutionary algorithm based on a tabu search procedure is discussed for multi-criteria optimization of distributed systems. A tabu mutation is applied to minimize the workload of the bottleneck computer. This can be achieved by task assignment as well as by selection of suitable computer types. Moreover, a negative selection procedure is developed for improving non-admissible solutions. Extended numerical results are presented.

Keywords: Evolutionary algorithm, Multi-criterion optimization, Distributed systems, Artificial intelligence, Pareto solutions.

Title of the Paper: Nodal Analysis- based Design for Improving Gas Lift Wells Production


Authors: Edgar Camargo, Jose Aguilar, Addison Ríos, Francklin Rivas, Joseph Aguilar-Martin

Abstract: In this work, a technique for improving gas-lift-based oil production wells is presented. The technique is based on Nodal Analysis, applied at the wellhead level, where production data are available. Thus, a production model is obtained, representing the production curve. This model allows calculating the relationship between production flow and pressure drop across all the components of the completion system. It is then possible to determine the oil or gas flow that can be produced by the well, considering the perforation and completion geometry. With this information, we can build a production optimization system in order to increase the production flow rate.
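
Nodal analysis in general amounts to finding the operating point where the inflow (reservoir) curve meets the outflow (tubing) curve. The sketch below uses the classic Vogel inflow relationship and a made-up linear outflow curve with illustrative numbers; it is not the authors' wellhead model:

```python
def vogel_inflow(pwf, p_res, q_max):
    """Vogel IPR: rate the reservoir delivers at bottom-hole pressure pwf."""
    r = pwf / p_res
    return q_max * (1.0 - 0.2 * r - 0.8 * r * r)

def outflow_pressure(q, p_head=500.0, slope=2.0):
    """Hypothetical linear outflow curve: pressure the tubing system
    requires to lift rate q (illustrative coefficients only)."""
    return p_head + slope * q

def operating_point(p_res=3000.0, q_max=1000.0, steps=30000):
    """Scan bottom-hole pressures; the operating point is where the
    inflow-available pressure matches the outflow-required pressure."""
    best = None
    for i in range(steps + 1):
        pwf = p_res * i / steps
        q = vogel_inflow(pwf, p_res, q_max)
        gap = abs(pwf - outflow_pressure(q))
        if best is None or gap < best[0]:
            best = (gap, q, pwf)
    return best[1], best[2]  # (rate, bottom-hole pressure)

q_op, pwf_op = operating_point()
```

An optimization system of the kind the abstract mentions would recompute this intersection as gas-injection parameters shift the outflow curve, seeking the injection rate that maximizes the operating rate.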

Keywords: Nodal Analysis, Production Systems, Artificial Gas Lift Well, Production Control.

Title of the Paper: Hand-Written Digits Recognition by Graph matching and Annealing Neural Networks


Authors: Kyunghee Lee

Abstract: Mean field annealing (MFA) is a promising tool in optimization, and neural network models based on graph matching have attracted attention due to a number of benefits over conventional recognition models. We present a neural network model for hand-written digit recognition using graph matching and two annealing techniques: MFA and one-variable stochastic simulated annealing (OSSA). OSSA makes it possible to evaluate the equilibrium spin average effectively by a Monte Carlo technique. In this paper, hand-written digit recognition is formulated as elastic graph matching, performed here by annealing a matching cost function. Our model provides not only recognition but also segmentation, such that input characters are correctly recognized and segmented simultaneously even if they are touching, connected, or degraded by noise. Simulation results show the capability of our model and the characteristics of MFA and OSSA.

Keywords: Pattern recognition, Segmentation, Optimization, Mean field annealing, Annealing neural networks.

Title of the Paper: Platform Support for an Intelligent Enterprise Business Management


Authors: Gabriela Rodica Hrin, Lucian Emanuel Anghel, Adrian David

Abstract: The IDEA platform supports all the specific processes of an enterprise in the meat processing industry, helping decision makers to manage performance by implementing the concepts of Business Performance Management (BPM) and Business Intelligence (BI). The platform transforms data into information and then into knowledge, focusing on the business, technological and economic aspects specific to meat processing enterprises and helping them to make efficient use of their business policies and their financial, human and material resources. The platform integrates software components dedicated to decision process management, customer relationship management and enterprise resource planning. It offers support for intelligent management of business processes, manufacturing flows, and enterprise resources. Tools considered in the development of the platform include business management systems, business workflow analysis, business performance management, OLAP (Online Analytical Processing), data modeling, data visualisation, report servers, and AJAX (Asynchronous JavaScript and XML) technology.

Keywords: Business Performance Management, dynamic interfaces, process modeling, data mining.

Title of the Paper: Fuzzy with LabVIEW Software for Reliability Prediction at Nuclear Complex System (NCS)


Authors: Vasile Anghel

Abstract: The reliability level of a nuclear installation is determined in general by the quality of the technological process in operation and maintenance, and in particular by a number of technical, technological, economic and human factors. The role of maintenance is fundamental for a nuclear installation. In maintenance, as in any dynamic area, new elements continuously appear which sometimes require new methods of approach. For the installation considered here, the Nuclear Detritiation Plant (NDP) of the National Research and Development Institute for Cryogenics and Isotopic Technologies – ICSI, Rm. Valcea, we propose fuzzy theory, entropy theory and LabVIEW software for predictive maintenance, in order to assure the reliability level in operation. The final aim is to achieve best practices for the maintenance of a plant that processes tritium.

Keywords: simulation, nuclear, reliability, maintenance, LabVIEW.

Title of the Paper: Continuous Auditing System Based on Registration Center


Authors: Huanzhuo Ye, Yuning He, Zhuoyuan Xiang

Abstract: With the acceleration of information technologies and the availability of online real-time information systems, a rapidly growing number of organizations are conducting business and publishing business and financial reports online and in real time. Real-time financial reports are likely to necessitate continuous auditing, so the question of which technologies can facilitate continuous auditing in the next generation of accounting systems has become very important. As an emerging information technology, Web services could be a good way to facilitate continuous auditing. Relying on a number of components of Web services technology, we propose a continuous auditing model based on a registration center, called the Web-service-based continuous auditing model (WSCAM). This continuous auditing mechanism would run in the auditee's system and could be applied to provide assurance about specific business processes. In such a model, the auditor can confirm specific information with the supplier of accounting materials for validation purposes, and the auditee can provide specific financial information to third parties for their transactions. The frameworks and technologies which can support such a Web-service-based continuous auditing mechanism are described, and its features are presented to illustrate WSCAM.

Keywords: Continuous auditing, Web services, XML, XBRL GL, Registration center.

Title of the Paper: Implementation of a Network Sport Training Platform in E-Learning Information System


Authors: Huay Chang

Abstract: In an earlier study, the author used information system technologies to present a new teaching and training method. Based on an e-learning web site and network teaching, we use a skilled system to establish an interactive Sport Teaching E-Learning structure. The proposed method provides a new teaching model and an innovative learning perspective. In this paper, we implement the E-Learning Information System in the Network Sport Teaching E-Learning Platform and obtain some special features. The positive features of the platform include ‘The Convenience of the Learning Guide’, ‘The Characteristics of Individual Course Programs’, ‘The Multiple Types of Learning Contents’, ‘The Creation of Sport Groupware’ and ‘The Instant Interactive Phase’.

Keywords: E-Learning, information system, network, skilled system, web-site.

Title of the Paper: Intelligent Monitoring Of Containers - IMC


Authors: Gabriela Rodica Hrin

Abstract: Intelligent monitoring of containers is an innovative and modern system dedicated to container management in multimodal transport, developed in Romania within a national research project. The system offers services for planning and real-time surveying of container transport and the transported freight. It is an information system that supports activities regarding freight mobility, dedicated to goods providers, goods buyers, brokers, road transport operators, railway transport operators, naval transport operators, air transport operators, insurance agencies, and container owners. The technologies used and integrated during system development are .Net, GPS (Global Positioning System), GIS (Geographical Information System), and GSM (Global System for Mobile Communications).

Keywords: Goods multimodal transport, container monitoring, transport planning and surveying, .Net, GPS, GIS, GSM.

Title of the Paper: Project Management Stage Mutations within Agile Methodological Framework Process Transformations


Authors: Evangelos Markopoulos, Javier Bilbao, Eugenio Bravo, Todor Stoilov, Tanjia E. J. Vos, Carlo Figa Talamanca, Katrin Reschwamm

Abstract: Projects are living entities. They are born with the project idea and end with the project termination. The time in between can last for many years and, in most cases, requires continuous implementation and management effort. Over time, changes in project requirements alter the implementation process and, in turn, the management process, the maintenance process, and so on. In order to maintain qualitative and quantitative project results, both the project implementation and management processes need to be adjusted to the overall project changes and environment. This adjustment can be made by using agile project management methodologies, defining processes based on the identification of the project goals, constraints and expectations. Unfortunately that is not enough, and projects, especially software projects, are still in an implementation and management crisis. This paper presents the concept of process mutation in project management methodological frameworks as a method supplementary to agile models and the concept of agility.

Keywords: Project Management, Process Mutation, Agility, Process Framework.

Title of the Paper: Integrating Computer Aided Design and Computer Aided Process Planning: A Computational Techniques Model Approach


Authors: Ionel Botef

Abstract: One of the most daunting challenges in Computer Integrated Manufacturing (CIM) is bridging the gap between Computer Aided Design (CAD) and Computer Aided Process Planning (CAPP). Past research into CAPP, considered one of the most important and most complicated computer aided systems, resulted in a wealth of knowledge, but unresolved problems still exist. Current CAPP systems are considered large, complex, and monolithic, with limited extensibility, a low level of integration with other applications, and high development and maintenance costs. Consequently, this paper focuses on a computational technique model for CAD/CAPP integration. Supported by authorities, evidence and logic, it is demonstrated that a limited number of important design and manufacturing features can be used to achieve an integrated product model that not only provides a direct interpretation of CAD data to the CAPP system, but also supplies sufficient information for generating the correct operation sequence of the process plan. The approach simplifies the information complexity of engineering drawings, and offers better computability, reusability and improved communication between CAD and CAPP. As a result, the approach is used to develop software applications that apply object-oriented programming as a new way of thinking about solving CAD/CAPP problems and as a promising alternative to other techniques.

Keywords: CAD, CAPP, Computational Technique

Title of the Paper: UML Data Models From An ORM (Object-Role Modeling) Perspective. Data Modeling at Conceptual Level


Authors: Daniel Ioan Hunyadi, Mircea Adrian Musan

Abstract: This paper provides an overview of Object-Role Modeling (ORM), a fact-oriented method for performing information analysis at the conceptual level. ORM provides both graphical and textual languages, together with a procedure that guides their use. The article is structured in two main parts. The first part presents an overview of ORM along with a real example, while the second part compares ORM and UML from the conceptual data modeling perspective. The paper examines data modeling in the Unified Modeling Language (UML) from the perspective of Object-Role Modeling (ORM). It provides some historical background on both approaches, identifies design criteria for modeling languages, and discusses how object reference and single-valued attributes are modeled in each. It compares UML multi-valued attributes with ORM relationship types, and discusses basic constraints on both, as well as instantiation using UML object diagrams or ORM fact tables. The final part compares UML associations and related multiplicity constraints with ORM relationship types and the related uniqueness, mandatory role and frequency constraints, and contrasts the instantiation of associations using UML object diagrams and ORM fact tables.

Keywords: Object-Role Modeling (ORM), FORML (Formal Object-Role Modeling Language), ER diagrams, CSDP, abstraction mechanism, semantic stability, semantic relevance, formal foundation.

Title of the Paper: Two Integration Flavors in Public Institutions


Authors: Vlad Diaconita, Iuliana Botha, Adela Bara, Ion Lungu, Manole Velicanu

Abstract: Integration within public institutions is useful in better aligning IT with the core processes, but it also helps the various parts of the business work with each other better, enabling important business strategies such as straight-through processing, improved public service through single-view-of-customer portals, business activity monitoring and higher data quality. Portals and SOA can help this integration occur. Over time, portals have evolved to meet the integration needs of companies. Even though not always taken very seriously, they have slowly become leaders in turning new principles into practical experience. In the beginning, portals focused on aggregating, organizing, and indexing unstructured data, but modern portals now do much more. A portal is a point of integration, useful to the organization for integrating internal business processes and for offering information to the outside world. The increased adoption of business process management (BPM) and Service-Oriented Architecture (SOA) initiatives is also driving portal usage.

Keywords: Portals, SOA, Web services, public institutions, dynamic reports.

Title of the Paper: Conceptual Framework on Risk Management in IT Outsourcing Projects


Authors: Syaripah Ruzaini Hj Syed Aris, Noor Habibah Arshad, Azlinah Mohamed

Abstract: Outsourcing is becoming a trend nowadays. Malaysia has also taken this opportunity and embraced IT outsourcing. As a result, Malaysia has been ranked as the third most attractive destination for outsourcing after India and China. Despite the increasing number of organizations involved in IT outsourcing, it should be noted that IT outsourcing is not a panacea: it comes together with risks. The risks, if not managed, will lead to outsourcing failure. Even though other areas have adopted risk management, its application in IT outsourcing has not been fully accomplished. Risk management should be conducted in IT outsourcing as it can foresee risks that might disturb the smooth running of IT outsourcing and prevent or reduce the impact of risks if they occur. It should be conducted at an early stage and performed continuously until the end of the outsourcing life cycle. This paper presents a conceptual framework to manage risk in IT outsourcing. The framework covers the risk management process in IT outsourcing as well as the risk management principles that should be applied at each and every phase of the IT outsourcing life cycle. A set of questionnaires was distributed to organizations to validate the conceptual framework. The findings showed that the consequence of not practicing risk management is poor control and management of IT outsourcing projects. Based on the findings, a future empirical and exploratory survey will be conducted and a framework for risk management in IT outsourcing will be developed.

Keywords: Risk Management, Malaysia, Analysis of Decision to outsource, Selection of Service Provider, Contract Management, On-Going Monitoring.

Title of the Paper: Computing and Modeling for Crop Yields in Burkina Faso Based on Climatic Data Information


Authors: Yu-Min Wang, Seydou Traore, Tienfuan Kerh

Abstract: The crop yield deficit under rainfed conditions in Burkina Faso, which lies in a dry tropical climate, is attributed to the cumulative effects of low precipitation and an inappropriate cropping calendar. An efficient use of agricultural water for gaining better productivity in the African semi-arid region has been widely suggested based on a cropping calendar approach. Therefore, a suitable cropping calendar could be determined by using the relationship between water and yields in order to achieve better water management and crop output. This paper models the crop water balance by applying climatic data collected from 1995 to 2006 to a reference model and a rainfall contribution index recently developed for Ouagadougou and Banfora, Burkina Faso. After the analysis, the crop water yield function concept is used to establish a climatic data information model that examines the final output under different planting dates. The results show that the difference between potential and expected yields was causally affected by the planting dates applied. In addition, by comparing the maximum expected yields to the average values, yields were reduced by 5 to 18% in Ouagadougou and by 4 to 23% in Banfora. The difference is small when the planting dates are close to the established suitable dates. The suitable cropping calendar determined using the model in this study should therefore be used to alleviate water shortage and yield deficit under rainfed conditions. Finally, low water-consuming crop species coupled with suitable planting dates could be recommended for agricultural water management in the African semi-arid region.

Keywords: Reference model, rainfall contribution index, climatic data information, modeling, yields, planting dates, water management

Title of the Paper: The Users Perceptions and Opportunities in Malaysia in Introducing RFID System for Halal Food Tracking


Authors: Norman Azah Anir, Nasir Mohd Hairul Nizam, Azmi Masliyana

Abstract: The lack of information presented on the packaging of a specific food product usually leads to confusion; thus redundant unsold goods are stacked up in the market. Barcodes, labels and ingredient information are by far not adequate to authenticate the validity of the food information claimed by the manufacturer or food producer. This long-established approach fails to inform users and no longer fits the connected world. Much work has been carried out to find the best solution to ensure the information presented on food packaging is genuine and legitimate. Similar cases in large food stores show that the agility of RFID has assisted them in better tracking their food status. Taking this example, we have carried out studies to better understand the capability of RFID in tracking Halal status in the Malaysian food market. This study aims to understand Malaysian users' perception of implementing a real-time tool to feed users genuine and validated information in the buying process. Furthermore, by conducting this study, researchers can better understand and identify the market opportunities for deploying such technology to Malaysian users. A quantitative approach was chosen to gather data from users around the Klang Valley and Kuala Lumpur, Malaysia. A survey form consisting of 32 questions was distributed to 50 to 60 identified users, and the selected respondents ranged from Halal to non-Halal users. Graphs and tables depict the findings on users' perception of the RFID tag for Halal tracking in Malaysia. The results show that 48% of the users agreed that a real-time system is required for information dissemination. However, only 34% know what RFID is and what it can do in a real-time system for Halal tracking. Approximately 98% of the respondents agree that a new tracking system is required for information traceability. This result shows a clear opportunity to introduce new tools; nevertheless, solid awareness activities are required to ensure the success of the new system.

Keywords: Halal information, Halal tracking, RFID, User tracking system.

Title of the Paper: Structured Data Representation Using Ruby Syntax


Authors: Kazuaki Maeda

Abstract: This paper describes Ribbon (Ruby Instructions Becoming Basic Object Notation), a new representation written in a text-based data format using Ruby syntax. The design principles of Ribbon are good readability and simplicity of structured data representation. An important feature of Ribbon is that the representation is executable: once the Ribbon-related definitions are loaded into a Ruby interpreter, the representation can be executed according to those definitions. Java programs are expected to read and write Java objects to persistent storage media, or to traverse the structured data. A program generator was developed to create Ruby and Java programs from Ribbon definitions. In the author's experience, productivity was improved in the design and implementation of programs that manipulate structured data.

Keywords: Data Representation, Structured Data, Domain Specific Languages, Ruby, Java.

Issue 6, Volume 5, June 2008

Title of the Paper: An Analysis of the Research on Adaptive Learning: The Next Generation of e-Learning


Authors: Elena Verdu, Luisa M. Regueras, Maria Jesus Verdu, Juan Pablo De Castro, Maria Angeles Perez

Abstract: This study examines the evidence for the effectiveness of adaptive learning and the satisfaction level of students when using this type of learning. It first analyses the different classifications of adaptive learning systems existing in the literature, to focus later on describing some adaptive and intelligent e-learning systems, mainly those included in the groups of Intelligent Tutoring Systems (ITS), Adaptive Hypermedia Systems (AHS) and Intelligent Collaborative Learning systems. Next, the Effect Size (ES) tool is adopted as a standard way to compare the results of one pedagogical experiment to another. ES is used to analyse the effectiveness of some of the systems previously described, in order to demonstrate that adaptive learning can provide significant improvements in the learning process of students. Secondly, the learners’ opinion is analysed in order to estimate their satisfaction and to know their preferred mode of studying. Finally, a number of conclusions and future trends are discussed.

Keywords: Adaptive learning, E-Learning, Intelligent Tutoring Systems, Intelligent Collaborative Learning, Intelligent Educational Systems, Adaptive Hypermedia Systems, Learning effectiveness.
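The Effect Size (ES) measure used in the abstract above to compare pedagogical experiments can be illustrated with a minimal sketch. The formula below is the common Cohen's d variant (standardized mean difference with a pooled standard deviation); the two score lists are purely hypothetical, not data from the paper:

```python
from statistics import mean, stdev

def cohens_d(experimental, control):
    """Effect Size (Cohen's d): standardized difference between two group means."""
    n1, n2 = len(experimental), len(control)
    s1, s2 = stdev(experimental), stdev(control)
    # Pooled standard deviation of the two groups
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(experimental) - mean(control)) / pooled

# Hypothetical post-test scores: adaptive-learning group vs. a control group
adaptive = [78, 85, 90, 72, 88, 81]
control  = [70, 75, 80, 65, 77, 74]
print(round(cohens_d(adaptive, control), 2))
```

An ES around 0.8 or above is conventionally read as a large effect, which is the kind of threshold such comparative surveys use when judging whether an adaptive system "significantly" improves learning.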

Title of the Paper: Multimedia Applications and their Benefit for Teaching and Learning at Universities


Authors: Eva Milkova

Abstract: Multimedia applications, together with individual approaches within the didactic process, substantially influence education. They give us an excellent opportunity not only to make the demonstration and visualization of the explained subject matter much clearer and more comprehensible, but also to prepare study material that optimizes students' study habits. The top applications of multimedia are represented by virtual reality. A well-organized and well-populated virtual learning environment has become an important part of teaching and learning. Using a few examples of multimedia products created by our students from a script given by the author of this paper, we discuss their benefit in explaining and visualizing the subject matter and in testing students' knowledge through several kinds of self-tests, and we also mention the advantages of a professional virtual learning environment containing such study material.

Keywords: Multimedia study material, virtual learning environment, self-preparation of students, on-line testing

Title of the Paper: Using Formal Concept Analysis to Design and Improve Multidisciplinary Clinical Processes


Authors: Telung Pan, Yunchun Tsai, Kwoting Fang

Abstract: Recent changes in health care have focused attention on new tools for planning and managing clinical processes. Clinical processes employ a concept long used in other industries: the explicit design and documentation of a process. However, the knowledge-driven perspective is seldom used when modeling or redesigning a process. Using formal concept analysis, we propose a knowledge management perspective that provides a method for modeling and designing a new multidisciplinary clinical process, in which different medical specialists coordinate the treatment of specific groups of patients and improve medical quality.

Keywords: Multidisciplinary clinical process, Formal concept analysis, Process modeling

Title of the Paper: Analytical Study of Object Components for Distributed and Ubiquitous Computing Environment


Authors: Usha Batra, Deepak Dahiya, Sachin Bhardwaj

Abstract: Distributed object computing is a paradigm that allows objects to be distributed across a heterogeneous network and allows each of the components to interoperate as a unified whole. A new generation of distributed applications, such as telemedicine and e-commerce applications, is being deployed in heterogeneous and ubiquitous computing environments. The objective of this paper is to explore the applicability of component-based services in a ubiquitous computing environment. While the fundamental structure of the various distributed object components is similar, there are differences that can profoundly impact an application developer or the administrator of a distributed simulation exercise, as well as their implementation in a ubiquitous computing environment.

Keywords: Ubiquitous Computing, COM, DCOM, RMI, CORBA, SOAP

Title of the Paper: The Design of Interactive Conversation Agents


Authors: Ong Sing Goh, Chung Che Fung

Abstract: Interactive conversation agents, or CAs, are computer programs or application software designed to simulate conversation with one or more human users in natural language. Providing CAs with knowledge, intelligence and a humanoid interface has allowed them to be used in several practical applications. This paper presents the development and performance of an interactive CA called Artificial Intelligent Natural-Language Identity, or AINI.

Keywords: Conversation Agents (CAs), Artificial Intelligence (AI), Artificial Intelligent Natural-Language Identity (AINI), Natural Language Processing (NLP)

Title of the Paper: Semantic Approach to Knowledge Processing


Authors: Mladen Stanojevic, Sanja Vranes

Abstract: The processing of semantic information requires adequate knowledge representation and the ability to interpret semantically related knowledge. The majority of present-day approaches to knowledge representation and processing are based on the symbolic approach, i.e. on describing the meaning of represented domain knowledge and of the procedural knowledge used to process it. Hierarchical Semantic Form (HSF) implements a semantic approach to knowledge representation and processing that is not based on naming as a means to describe the meaning of the knowledge, but on semantic contexts, which enable an implicit way to define the meaning of represented knowledge, and on simple and complex semantic categories used to interpret its semantics. HSF facilitates the automatic translation of knowledge expressed in natural language into structured form, and vice versa, with no loss of information, and supports its processing, including natural language understanding, semantic search and question answering.

Keywords: Semantic Knowledge Processing, Knowledge Representation, Connectionist Model, Question Answering

Title of the Paper: Conducting Fuzzy Division by using Linear Programming


Authors: Murat Alper Basaran, Cagdas Hakan Aladag, Cem Kadilar

Abstract: Several approximation methods have been proposed for fuzzy multiplication and division in the literature. Instead of performing arithmetic operations directly on the fuzzy membership functions of fuzzy numbers, parameterized representations of fuzzy numbers have been used. The parameterized fuzzy numbers most often applied in the literature are symmetric and asymmetric triangular and trapezoidal fuzzy numbers. In this study, we propose a new approximation method based on linear programming for fuzzy division. To show the applicability of the proposed method, some examples are solved with it and the results are compared with those generated by other methods in the literature. The proposed method produces better results than the others.

Keywords: Approximation method; Fuzzy arithmetic; Fuzzy division; Linear programming; Triangular fuzzy number
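For readers unfamiliar with fuzzy division, the baseline that such approximation methods compete with can be sketched via interval arithmetic on alpha-cuts of triangular fuzzy numbers. This is the standard textbook approach, not the paper's linear-programming method, and the two fuzzy numbers below are arbitrary illustrations:

```python
def alpha_cut(tri, alpha):
    """Alpha-cut [lower, upper] of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def divide_cuts(x, y, levels=5):
    """Approximate the fuzzy division X / Y by interval division on alpha-cuts.
    Assumes the support of Y does not contain zero."""
    cuts = []
    for i in range(levels + 1):
        alpha = i / levels
        p, q = alpha_cut(x, alpha)
        r, s = alpha_cut(y, alpha)
        ratios = [p / r, p / s, q / r, q / s]  # interval division endpoints
        cuts.append((alpha, min(ratios), max(ratios)))
    return cuts

# X = "about 6" = (4, 6, 8), Y = "about 2" = (1, 2, 3)
for alpha, lo, hi in divide_cuts((4, 6, 8), (1, 2, 3)):
    print(f"alpha={alpha:.1f}: [{lo:.3f}, {hi:.3f}]")
```

Note that the resulting membership function is no longer triangular (the lower bounds are nonlinear in alpha), which is exactly why parameterized approximations such as the one proposed in the paper are of interest.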

Title of the Paper: A Graph-Segment-Based Unsupervised Classification for Multispectral Remote Sensing Images


Authors: Nana Liu, Jingwen Li, Ning Li

Abstract: With the growing number of applications of multispectral remote sensing images, effective and correct automated classification of multispectral images is still a great challenge. Utilizing both spatial contextual information and spectral information can achieve better classification performance. In order to make better use of the spatial contextual information, we apply a graph model to the multispectral image and use graph-based segmentation to produce units of pixels for further classification. In this paper, we present an unsupervised approach for multispectral remote sensing image classification using graph-based segmentation and fuzzy c-means clustering. Our method mainly involves the following steps. First, the image is represented as a graph H = (V, E) based on the per-pixel feature vectors and the relationships among neighboring pixels, and the graph is segmented into groups of sub-regions as basic object units using an effective graph segmentation algorithm. Then, according to the global feature vectors of the sub-regions, fuzzy c-means clustering is used to obtain the classification map based on these sub-regions. Experiments with different segmentation scales show that the approach proposed in this paper achieves better accuracy and efficiency.

Keywords: Multispectral image; Segment-based classification; Graph-based segmentation; Fuzzy c-means; Unsupervised classification; Remote sensing images classification
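The fuzzy c-means step of the pipeline above can be sketched in a few lines. This is the generic FCM iteration (weighted centers, inverse-distance membership update), here on one-dimensional per-segment features; the segment values are invented for illustration and the paper's actual feature vectors are multidimensional:

```python
import random

def fuzzy_cmeans(data, k=2, m=2.0, iters=50):
    """Fuzzy c-means on 1-D feature values (e.g. mean spectral value per segment).
    Returns the cluster centers and the membership matrix u[i][j]."""
    random.seed(0)
    # Random initial memberships, normalized per sample
    u = []
    for _ in data:
        row = [random.random() for _ in range(k)]
        s = sum(row)
        u.append([v / s for v in row])
    for _ in range(iters):
        # Update centers as membership-weighted means
        centers = []
        for j in range(k):
            num = sum((u[i][j] ** m) * x for i, x in enumerate(data))
            den = sum(u[i][j] ** m for i in range(len(data)))
            centers.append(num / den)
        # Update memberships from distances to the centers
        for i, x in enumerate(data):
            d = [abs(x - c) or 1e-12 for c in centers]  # guard zero distance
            for j in range(k):
                u[i][j] = 1.0 / sum((d[j] / d[l]) ** (2 / (m - 1)) for l in range(k))
    return centers, u

# Hypothetical per-segment mean intensities forming two spectral groups
segments = [0.1, 0.15, 0.12, 0.8, 0.85, 0.78]
centers, u = fuzzy_cmeans(segments, k=2)
print(sorted(round(c, 2) for c in centers))
```

Each segment ends up with a soft membership in every class, and the classification map is obtained by assigning each sub-region to its highest-membership class.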

Title of the Paper: Organizational Aspect of Trusted Legally Valid Long-Term Electronic Archive Solution


Authors: Helena Halas, Jan Porekar, Tomaz Klobucar, Aleksej Jerman Blazic

Abstract: Due to the growth of electronic business and business process dematerialization, organizations today face the problem of preserving vast amounts of electronic documents in a coherent and trustworthy manner. A large number of digital documents are produced every day, even in small and medium-sized companies. The documents range from simple receipts to complex legal contracts and service level agreements. Many such documents need to be stored and preserved for a long period of time. Some services and technical solutions providing long-term proofs of authenticity, integrity and non-repudiation of electronic documents are available on the market today. For these technical electronic archiving solutions and services to be successfully adopted by organizations, they need to be deployed in a proper operational and organizational manner. Besides this, an organization needs to establish the required operational procedures and operate in accordance with them to assure that the trusted electronic archive is legally valid. In this paper we present a first set of organizational approaches that organizations need to utilize in order to successfully integrate the operational and legal aspects of electronic archiving and to change their business processes accordingly. Following the approach of pattern-oriented organizational design, we capture trusted organizational archiving solutions and best practices in the form of patterns, describing the context of the problem, the generic solution in the form of organizational diagrams, the preconditions that need to be met by the organization, and the dependencies on other patterns. Finally, the paper presents the implementation of the generic solution in different organizational contexts and indicates the influence of different applications of the pattern on further solution development.

Keywords: electronic archive, long-term preservation, security, legal compliance, durable integrity, durable authenticity, organizational patterns

Title of the Paper: Reinforcing the Concept of Calculating Isotope Pattern Using Theoretical Isotope Generator (TIG)


Authors: Massila Kamalrudin, Soong Hoong Cheng, Azlianor Abdul Aziz, Muhammad Suhaizan Sulong

Abstract: This paper describes how isotope patterns are calculated for chemists, students, lecturers and researchers using the Theoretical Isotope Generator (TIG), built on the .NET framework (specifically VB.NET) with data from dual databases (Microsoft Access and Microsoft SQL 2000). In terms of functionality and features, TIG is an application developed to provide information on molecules and their relative intensities for educational purposes. Although various mass spectrometry applications are available as web-based and Windows-based systems, the proposed software is practical, beneficial and constructive for academic learners as well as researchers in rapidly determining the possible isotope patterns obtainable from a substance (chemical compound), for example methane (CH4). TIG uses the mathematical construction called the Cartesian product to obtain the isotope patterns of any atom or molecule. TIG provides more features than comparable tools, being able to retrieve the mass (atomic/molecular weight) of the specified substance based on the available isotope patterns. Additionally, TIG is complete with several functionalities such as drawing, normalizing and zooming the generated graph, and it conveys the molecular information in a number of formats by providing the details of the calculation and the molecules.

Keywords: Isotope pattern, Cartesian Products, Calculation, Engine, Isotopic Distribution and Theoretical Isotope Generator (TIG)
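The Cartesian-product idea behind TIG can be sketched independently of its VB.NET implementation: enumerate every per-atom isotope choice, sum the masses, and accumulate the probability of each total mass. The sketch below uses chlorine (standard isotope abundances) rather than the paper's methane example, because Cl2 gives a small, well-known three-peak pattern:

```python
from itertools import product
from collections import defaultdict

# Chlorine isotopes: (mass number, natural abundance) — standard values
CL = [(35, 0.7577), (37, 0.2423)]

def isotope_pattern(isotopes, n_atoms):
    """Enumerate the Cartesian product of per-atom isotope choices and
    accumulate the probability of each total mass."""
    pattern = defaultdict(float)
    for combo in product(isotopes, repeat=n_atoms):
        mass = sum(m for m, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p
        pattern[mass] += prob
    return dict(sorted(pattern.items()))

# Pattern for Cl2: peaks at total masses 70, 72 and 74
for mass, p in isotope_pattern(CL, 2).items():
    print(mass, round(p, 4))
```

The brute-force enumeration grows exponentially with atom count, which is why practical generators prune or convolve per-element distributions instead; the sketch is only meant to show the Cartesian-product principle the abstract names.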

Title of the Paper: The Design and Implementation of Background Pass-Go Scheme Towards Security Threats


Authors: L. Y. Por, X. T. Lim, M. T. Su, F. Kianoush

Abstract: Currently, access to computer systems is often based on the use of alpha-numeric passwords. Textual or alpha-numeric passwords have been the basis of authentication systems for centuries. They have likewise been a major attraction for crackers and attackers. However, users tend to face difficulty remembering a password that is considered secure, because such a password usually consists of a long string of characters that appear random [14]. Hence, most users tend to create simple, short and insecure passwords. As a consequence, the usability of passwords mostly does not reach the optimum for a secure password [14]. To address this problem, a new password scheme was invented, known as the Graphical Password System (GPS). A graphical password is an alternative means of login authentication intended to be used in place of conventional passwords; it utilizes images instead of text. In this paper, we discuss the design and intention of our proposed scheme, called Background Pass-Go (BPG). BPG is an improved version of Pass-Go, as it keeps most of the advantages of Pass-Go while achieving better usability. We analyze the BPG scheme in terms of 1) how BPG improves on other GPS schemes, especially Pass-Go, and 2) how BPG acts as a solution to different types of threats to networked computer systems. We verify that BPG overcomes the shortcomings of other GPS schemes. Moreover, BPG also addresses most of the security threats relevant to network security systems.

Keywords: BPG, Background Pass-Go, Pass-Go, GPS, Graphical Passwords System, Security.

Title of the Paper: Ontology based Framework for Data Integration


Authors: Alberto Salguero, Francisco Araque, Cecilia Delgado

Abstract: One of the most complex issues in data integration is the case where there are multiple sources for the same data element. It is not easy to generate and maintain the integrated schema. In this paper we describe a framework which encompasses the entire data integration process. The data source schemas, as well as the integrated schema, are expressed using an extension of an ontology definition language which allows the incorporation of metadata to support the integration process. The proposed model allows the user to concentrate on modeling the problem itself and not on the issues of dealing with the temporal and spatial aspects associated with many of the data sources usually used in enterprise information systems.

Keywords: temporal model, data integration, ontology, OWL, metadata, enterprise information system.

Title of the Paper: Multiple Linear Regression in Forecasting the Number of Asthmatics


Authors: Darmesah Gabda, Zainodin Hj Jubok, Kamsia Budin, Suriani Hassan

Abstract: The objective of this study was to determine the association between the number of asthmatic patients in Kota Kinabalu, Sabah and air quality and meteorological factors using multiple linear regression. Eight main independent variables, with interactions up to the fourth order, were included in the model. Eighty possible models were considered, and the best model was obtained using the eight selection criteria (8SC). The results showed that the rise in the number of asthmatics is best represented by model M80.23.

Keywords: multiple regression, eight selection criteria, fourth-order interaction, best model, asthma.
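The model-selection workflow described above — fit many candidate regressions, then pick the best by selection criteria — can be sketched with one such criterion, adjusted R-squared. The sketch below uses simple (one-predictor) least squares and entirely invented monthly data; it is not the paper's 8SC procedure or its dataset:

```python
from statistics import mean

def fit_simple(xs, ys):
    """Ordinary least squares for y = b0 + b1*x; returns (b0, b1)."""
    mx, my = mean(xs), mean(ys)
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    return my - b1 * mx, b1

def adjusted_r2(xs, ys, k=1):
    """Adjusted R-squared: goodness of fit penalized for k predictors."""
    b0, b1 = fit_simple(xs, ys)
    my = mean(ys)
    ss_res = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    n = len(ys)
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Hypothetical monthly data: asthma cases vs. two candidate predictors
cases = [30, 42, 55, 38, 60, 47]
pm10  = [20, 35, 50, 28, 58, 40]   # air-quality reading
rain  = [5, 3, 9, 7, 2, 6]         # rainfall index
best = max([("PM10", pm10), ("rain", rain)],
           key=lambda c: adjusted_r2(c[1], cases))
print(best[0])
```

The 8SC approach works the same way in spirit, except that eight criteria are computed for each of the 80 candidate models and the model that wins across the criteria is retained.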

Title of the Paper: Comparative Analysis of Fuzzy Decision Tree and Logistic Regression Methods for Pavement Treatment Prediction


Authors: Devinder Kaur, Haricharan Pulugurta

Abstract: Data mining is the process of extracting hidden predictive information from large databases and expressing it in a simple and meaningful manner. This paper explains the use of fuzzy logic as a data mining technique to generate decision trees from a pavement (road) database obtained from the Ohio Department of Transportation, containing historical pavement information from 1985 to 2006. There are generally many attributes in a pavement database, and it is often a complicated process to develop a mathematical model to classify the data. This study demonstrates the use of fuzzy logic to generate a decision tree that classifies the pavement data. The fuzzy decision tree is then converted to fuzzy rules. These fuzzy rules assist the decision-making process for selecting a particular type of repair for a pavement based on its current condition. The fuzzy decision tree induction method used is based on minimizing the measure of classification ambiguity across the different attributes. Such models overcome sharp-boundary problems, providing a soft decision surface and good accuracy when dealing with continuous attributes and prediction problems. The method was compared with a common logistic regression model for predicting the pavement treatment. The results show that the fuzzy decision tree method outperforms the logistic regression model by 10%. The fuzzy decision tree method also generates rules, which give a better understanding of the relationship between the parameters and the predicted pavement treatment.

Keywords: Pavement Management, Classification Ambiguity, fuzzy ID3, Logistic Regression
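The classification ambiguity measure that the fuzzy ID3 induction minimizes can be illustrated with a small sketch. This is the Yuan and Shaw style possibility-based ambiguity commonly used with fuzzy ID3 (the abstract does not spell out its exact formula, so treat this as the standard form rather than the paper's); the membership vector is hypothetical:

```python
from math import log

def classification_ambiguity(memberships):
    """Yuan & Shaw style ambiguity of a fuzzy class-membership vector:
    0 for a crisp assignment, ln(n) for total indecision over n classes."""
    # Normalize to a possibility distribution (max = 1), sort descending
    top = max(memberships)
    pi = sorted((m / top for m in memberships), reverse=True) + [0.0]
    return sum((pi[i] - pi[i + 1]) * log(i + 1) for i in range(len(memberships)))

# Hypothetical memberships of a road section in three treatment classes
print(round(classification_ambiguity([1.0, 0.0, 0.0]), 3))  # crisp assignment
print(round(classification_ambiguity([0.5, 0.5, 0.5]), 3))  # maximal indecision
```

At each node, the attribute whose partition yields the lowest weighted ambiguity is chosen for the split, which is the fuzzy analogue of ID3's information-gain criterion.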

Title of the Paper: Modeling Reference Evapotranspiration by Generalized Regression Neural Network in Semiarid Zone of Africa


Authors: Seydou Traore, Yu-Min Wang, Tienfuan Kerh

Abstract: This paper investigates, for the first time in Burkina Faso, the potential of using an artificial neural network (ANN) for reference evapotranspiration (ETo) estimation. The generalized regression neural network (GRNN) algorithm was selected for its ability to model ETo from minimal climatic data. Irrigation management in Burkina Faso is still hampered by the unavailability of the climatic data needed to estimate ETo with the recommended Penman-Monteith (PM) equation. Recently, to overcome this difficulty, a reference model for Burkina Faso (RMBF) using only temperature as input was developed for irrigation management purposes in two production sites, Banfora and Ouagadougou. In this study, four alternatives to PM, namely GRNN, RMBF, Hargreaves (HRG) and Blaney-Criddle (BCR), were evaluated in three production sites: Dori, Bogande and Fada N'gourma. The minimal climatic data, used as input variables, were the maximum and minimum air temperatures collected from 1996 to 2006. The results revealed that RMBF, HRG and BCR overestimated ETo and showed poor performance, while GRNN performed better than all three. Finally, wind was determined to be a sensitive parameter in ETo estimation for the areas studied. Using GRNN with minimal climatic variables for ETo estimation is thus more reliable than the other alternative methods, and it is possible to estimate ETo using an ANN in the semiarid zone of Africa.

Keywords: Evapotranspiration, estimating, GRNN, minimum climatic data, semiarid zone, irrigation management.

Title of the Paper: Development of a Learning Content Management System


Authors: Lejla Abazi-Bexheti

Abstract: Change appears to be the only constant in the field of ICT, and what was treated as an advanced feature a few years ago is old-fashioned today. If dealing with such rapid change in the field is increasingly difficult and complex, it is even more complicated to simplify the concepts and processes and to define a learning system’s model and the features that would contribute to more effective teaching and learning. As part of the research project team that aims to develop Learning Content Management System software at SEE University, we first had to select the features that would cover our needs and comply with current trends in this area of software development, and then design and develop the system. In this paper we present the in-house development of an LCMS for South East European University: its architecture, conception and strengths.

Keywords: e-learning, LCMS, e-learning systems, conceptual design, architecture, system modules

Title of the Paper: Network Structure Mining: Locating and isolating core members in covert terrorist networks


Authors: Muhammad Akram Shaikh, Wang Jiaxin

Abstract: Knowing patterns of relationships in covert (illegal) networks is very useful for law enforcement agencies and intelligence analysts investigating collaborations among criminals. Previous studies in network analysis have mostly dealt with overt (legal) networks with transparent structures. Unlike conventional data mining, which extracts patterns from individual data objects, network structure mining is especially suitable for mining large volumes of association data to discover hidden structural patterns in criminal networks. Covert networks share some features with conventional (real-world) networks, but they are harder to identify because they mostly hide their illicit activities. Since the September 11, 2001 attacks, social network analysis (SNA) has increasingly been used to study criminal networks; however, finding out who is related to whom on a large scale in a covert network is a complex problem. In this paper we discuss how network structure mining is applied in the domain of terrorist networks, using structural measures (indices) from SNA and web structure mining research, and propose an algorithm for network disruption. Structural properties are determined by the graph structure of the network; they are used to locate and isolate core members via an importance ranking score, and thereby to analyze the effect of removing these members from terrorist networks. The discussion is supported with a case study of the Jemaah Islamiyah (JI) terrorist network.
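The importance-ranking idea can be illustrated with the simplest SNA index, normalized degree centrality; the abstract's point is that removing the highest-ranked member most disrupts the network. The toy network below is hypothetical, not the JI data:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Rank members of an undirected network by normalized degree centrality
    (fraction of the other members each node is directly tied to), one of
    the structural indices used for importance ranking in SNA."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    n = len(adj)
    return sorted(((node, len(nbrs) / (n - 1)) for node, nbrs in adj.items()),
                  key=lambda t: -t[1])

# A star-like cell: removing the hub 'A' disconnects the periphery.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")]
ranking = degree_centrality(edges)
print(ranking[0])  # ('A', 0.75): A touches 3 of the 4 other members
```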

Keywords: Networks, Centrality, Dependency, Rank, Influence, and Destabilization.

Title of the Paper: Classification of Personal Arabic Handwritten Documents


Authors: Salama Brook, Zaher Al Aghbari

Abstract: This paper presents a novel holistic technique for classifying Arabic handwritten text documents. The classification of Arabic handwritten documents is performed in several steps. First, the Arabic handwritten document images are segmented into words, and then each word is segmented into its connected parts. Second, several structural and statistical features are extracted from these connected parts and then combined to represent a word with one consolidated feature vector. Finally, a generalized feedforward neural network is used to learn and classify the different styles/fonts into word classes, which are used to retrieve Arabic handwritten text documents. The extraction of structural and statistical features from the individual connected parts as compared to the extraction of these features from the whole word improved the performance of the system.

Keywords: Data mining of Arabic text, Word recognition, Arabic handwriting, Segmentation of Arabic handwritten documents, Feature extraction, Classification, and Retrieval of Arabic handwritten documents

Title of the Paper: Localization of Distributed Data in a CORBA-based environment


Authors: Milko Marinov, Svetlana Stefanova

Abstract: Query processing over distributed and fragmented databases is more challenging than in a centralized environment, since the DBMS needs to know where each node is located. The main role of the data localization layer is to localize a query’s data using data distribution information. We propose an approach that incorporates artificial intelligence techniques into a distributed database management system (DBMS), namely extending the core of a distributed CORBA-based environment with deductive functionalities for the query and view services during data localization. The basic principles and architecture of the software tool are considered. The implementation and class hierarchy of the object-oriented theorem prover built into the core of the distributed CORBA-based system are also discussed.

Keywords: Distributed systems, Data localization, CORBA-based architecture, Theorem prover.

Title of the Paper: Issues in the Implementation of Software Process Improvement Project in Malaysia


Authors: Mohd Hairul Nizam Md. Nasir, Rodina Ahmad, Noor Hafizah Hassan

Abstract: Software Process Improvement (SPI) has become known over the last twenty years. SPI is crucial for augmenting the software process capabilities of software companies facing today’s demanding, global market. There have been numerous published studies in the United States, Europe, Australia and North America, yet there is still a shortage of research and published studies on SPI in Malaysia. This research attempts to fill that gap by analyzing the resistance factors that de-motivate the implementation of SPI projects, specifically in software companies operating in Malaysia. The research was conducted from March 2008 until August 2008 and used a survey instrument to gather data from 39 companies operating across Malaysia, with a total of 251 professionals responding. The findings show that organizational factors, specifically human factors, play an important role in determining the success of an SPI project. Participation and commitment from all individuals across the organization are also vital to ensuring the success of an SPI initiative.

Keywords: Software Process Improvement, Resistance Factors, SPI Implementation, SPI in Malaysia

Title of the Paper: A Fuzzy AHP Application on Evaluation of High-Yield Bond Investment


Authors: Chen-Yu Lee, Jao-Hong Cheng

Abstract: The returns and risks of high-yield bonds (HYB) lie between those of stocks and Treasury bonds. In view of investment opportunities and rates of return, the advantages of HYB are both lower risk and higher yield; HYB have therefore become an important component of portfolios. The purpose of this study is to identify critical factors related to the selection of HYB. Primary criteria for evaluating HYB selection are obtained from a literature survey and by applying the fuzzy Delphi method (FDM); the fuzzy analytic hierarchy process (FAHP) is then employed to calculate the weights of these criteria, so as to build a fuzzy multi-criteria model of HYB investment. The results indicate the greatest weight on the economic-environment dimension, and three critical evaluation criteria related to HYB selection: (1) spread versus Treasuries, (2) bond callability, and (3) the default rate indicator.

Keywords: High-Yield Bond, Portfolio Management, Credit Rating, Financial Failure, Analytic Hierarchy Process (AHP), Fuzzy Delphi Method (FDM), Fuzzy Analytic Hierarchy Process (FAHP).

Title of the Paper: Adapting Software Engineering Design Patterns for Ontology Construction


Authors: Stanislav Ustymenko, Daniel Schwartz

Abstract: In this paper, we present an argument for designing metadata schemata with design patterns. Design patterns are structured descriptions of solutions to some class of problems, and are used extensively in various stages of object-oriented software engineering. We present a use case of collaborative construction of metadata for a digital library. We explore design challenges this scenario presents and then adapt a pattern called Composite from a standard software engineering design patterns reference to address parts of these challenges. Additionally, we propose a new design pattern called History suggested by a collaborative metadata construction scenario and applicable to a wider class of problems in metadata design.

Keywords: Design Patterns, Knowledge Engineering, Object-Oriented Design, Semantic Metadata, Web Ontology

Title of the Paper: Towards a Flexible Tool for Supporting Data Collection & Analysis in Personal Software Process (PSP)


Authors: Mohd Hairul Nizam Md Nasir, Azah Anir Norman, Noor Hafizah Hassan

Abstract: The Personal Software Process (PSP) provides software engineers with an excellent framework and practices that can help them improve the quality of their work, by analyzing their performance statistically and helping them achieve realistic goals they set for themselves. PSP offers many other benefits to software engineers as well. However, studies have found that the PSP adoption problem may be caused by the overhead of data collection, manual execution of data analysis, and inflexibility of the process definition. This paper presents in detail the factors that influence the PSP adoption problem and explains the need for an automated tool to support the adoption of PSP. We believe that a highly flexible automated tool can give software engineers the flexibility to manage their process definition rather than leaving it frozen, and can minimize the overhead of the data collection and data analysis phases. With the additional features provided by such a tool, software engineers can easily monitor, measure and improve their software development process.

Keywords: Personal Software Process, PSP, Automated Tool, Software Process, Flexible Tool.

Issue 7, Volume 5, July 2008

Title of the Paper: Modeling and Development of the Real-time Control Strategy For Parallel Hybrid Electric Urban Buses


Authors: Yuanjun Huang, Chengliang Yin, Jianwu Zhang

Abstract: This paper proposes a feed-forward control model for the SWB6105 parallel hybrid electric urban bus (PHEUB), built in Matlab/Simulink. In order to optimize fuel economy, balance the battery state of charge (SOC), and satisfy the vehicle performance and drivability requirements, a logic threshold torque distribution control strategy (LTTDCS) incorporating an instantaneous optimization algorithm was developed for the PHEUB. Bench test results for the key components of the hybrid powertrain serve as reliable references in the modeling. The control strategy is validated by simulation results for the engine, the motor and the battery in terms of fuel economy and battery SOC deviations.

Keywords: Parallel hybrid electric urban bus (PHEUB), Hybrid powertrain, Hybrid system modeling, Logic threshold torque distribution control strategy (LTTDCS), Instantaneous optimization algorithm, Real-time control

Title of the Paper: Combinatorial Effect of Various Features Extraction on Computer Aided Detection of Pulmonary Nodules in X-ray CT Images


Authors: Noriyasu Homma, Kazunori Takei, Tadashi Ishibashi

Abstract: In this paper, we propose a new method for computer-aided detection of pulmonary nodules in X-ray CT images that reduces the false positive rate under high true positive rate conditions. An essential part of the method is to extract and combine two novel and effective features from the original CT images: one is the orientation of nodules in a region of interest (ROI) extracted by a Gabor filter, while the other is the variation of CT values of the ROI in the direction along the body axis. Using the extracted features, pattern recognition techniques can then discriminate between nodule and non-nodule images. Simulation results show that discrimination performance using the proposed features is greatly improved compared to that of the conventional method.

Keywords: Computer aided diagnosis, Lung cancer, Pulmonary nodules, Feature extraction, Image recognition, X-ray CT images

Title of the Paper: Multi-Grid Background Pass-Go


Authors: L. Y. Por, X. T. Lim

Abstract: Computer security depends largely on passwords to authenticate human users for access to secure systems. Remembering a chosen password is a frequent problem for all users; as a result, they tend to choose short, insecure passwords over secure ones, which usually consist of a long mixture of random alphanumeric and non-alphanumeric characters. This tendency to choose insecure passwords has created many security problems. Graphical passwords are an alternative to alphanumeric passwords in which users authenticate themselves by clicking on images rather than typing alphanumeric strings. The main objectives of this paper are to present a classification of graphical password systems (GPS) and to identify future research areas. We identify a number of threats to networked computer systems and analyze several aspects of GPS: 1) how each GPS algorithm works, 2) the advantages and disadvantages of each GPS algorithm, and 3) how each GPS algorithm is able to address the threats. The paper also presents the design and implementation of a proposed prototype, Multi-Grid Background Pass-Go (MGBPG), whose multi-grid background is intended to be its strength and winning edge over other graphical password systems. Preliminary results and analysis of the proposed prototype are then presented by comparing its role in addressing the drawbacks of existing GPS and several security attacks. Finally, we highlight a few aspects that need to be improved in the future to overcome the deficiencies of previously invented GPS methods.

Keywords: GPS, Graphical Password System, Background Pass-Go, BPG, Multi-Grid Background Pass-Go, MGBPG, Password, Authentication, Threat.

Title of the Paper: Learning Schemes in Using PCA Neural Networks for Image Restoration Purposes


Authors: Ion Rosca, Luminita State, Catalina Lucia Cocianu

Abstract: Image restoration methods improve the appearance of an image by applying a restoration process that uses a mathematical model of image degradation. Restoration can be viewed as a process that attempts to reconstruct or recover an image that has been degraded, using some a priori knowledge about the degradation phenomenon. Principal component analysis (PCA) identifies a linear transformation such that the axes of the resulting coordinate system correspond to the directions of largest variability of the investigated signal. The advantage of using principal components derives from the fact that the bands are uncorrelated, so no information contained in one band can be predicted from the other bands; the information contained in each band is therefore maximal for the whole set of bits. The multiresolution support set is a data structure suitable for developing noise removal algorithms. Multiresolution algorithms perform restoration tasks by combining, at each resolution level and according to a certain rule, the pixels of a binary support image; the multiresolution support can be computed using the statistically significant wavelet coefficients. We investigate the comparative performance of different PCA algorithms derived from Hebbian learning, lateral interaction algorithms and gradient-based learning for digital signal compression and image processing purposes. The final sections of the paper focus on PCA-based approaches to image restoration based on the multiresolution support set, as well as on a PCA-based shrinkage technique for noise removal. The proposed algorithms were tested, and some of the results are presented and commented upon at the end of each section.
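One of the Hebbian PCA learning schemes the abstract refers to can be sketched with Oja's rule for the first principal component. The data below are our own toy values, not the authors' experiments:

```python
import random

def oja_first_pc(data, lr=0.01, epochs=200, seed=0):
    """Extract the first principal direction with Oja's Hebbian rule,
    w += lr * y * (x - y * w): the Hebbian term lr*y*x grows the weight
    along high-variance directions, while the -lr*y*y*w decay keeps the
    weight vector near unit length. Pure-Python sketch for 2-D, zero-mean data."""
    random.seed(seed)
    w = [random.random(), random.random()]
    for _ in range(epochs):
        for x in data:
            y = w[0] * x[0] + w[1] * x[1]            # neuron output (projection)
            w = [w[0] + lr * y * (x[0] - y * w[0]),
                 w[1] + lr * y * (x[1] - y * w[1])]
    return w

# Zero-mean points spread along the x = y diagonal: the first principal
# direction is close to (1, 1) / sqrt(2), i.e. both components near 0.707.
data = [(2.0, 1.9), (-2.0, -2.1), (1.0, 1.1), (-1.0, -0.9)]
w = oja_first_pc(data)
print(w)
```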

Keywords: dimensionality reduction, image restoration, shrinkage functions, neural networks

Title of the Paper: Knowledge Management in the HR Sector of R&D Organizations


Authors: Valentina Janev, Ana Dokic, Marija Minic, Sanja Vranes

Abstract: This paper presents a solution for managing a specific HR asset of research organizations: the scientific track records of scientists and researchers. The solution has already been installed at the Mihajlo Pupin Institute (MPI) in Belgrade as part of a knowledge management initiative started a few years ago. It is based on the SAP Human Capital Management (HCM) solution, which ensures standardized integration and automation of HR activities and provides a wide range of reporting options. However, as the standard SAP HCM solution does not cover the specific aspects of the R&D business process, additional functionalities were built on top of the standard SAP system in order to keep an extensive record of each employee’s professional and scientific life, including scientific and professional skills and expertise, degrees and certificates obtained, engagements in concrete projects with details of roles and competences, scientific achievements (patents, technical solutions, scientific papers and books), and other achievements and awards. The new functionalities were built taking into consideration the recommendations of the Serbian Ministry of Science for compiling the Researcher’s file. The new HCM solution is used for in-house analysis and reporting, as well as for reporting to the Serbian Ministry of Science and the Statistical Office of the Republic of Serbia. Moreover, researchers’ CVs are also kept in several standard forms (European Commission, World Bank, Microsoft, etc.), facilitating the preparation of international project proposals and bidding material. In this paper we discuss the implementation aspects of the solution, as well as the lessons learned and the benefits gained.

Keywords: knowledge management, scientific track record, R&D organization, SAP HCM, ontology, integrated KM platform

Title of the Paper: A Lightweight Buyer’s Trust Model for Micropayment Systems


Authors: Samad Kardan, Mehdi Shajari

Abstract: In this paper we present a trust model for an enhanced version of the MR2 micropayment scheme, which we name TMR2. TMR2’s lightweight trust model is based on user polling, sales volume and vendor reputation. We claim that TMR2 can solve the mind-barrier problem, which will result in the expansion of micropayment usage. The new token proposed to handle trust is a certificate, called a Rating certificate, issued to merchants to support users’ trust. Our proposed purchase process requires validation of this certificate; therefore, compared to MR2, the proposed scheme needs only one extra on-line digital signature validation. This validation adds less than 4 percent computational overhead to the purchase process, which is acceptable considering the benefits that users gain.

Keywords: User Trust Model, MR2 Micropayment Scheme, Micropayment Schemes, Computational Trust.

Title of the Paper: Correlational Analysis of Layered Superconducting Cuprates


Authors: Vlad Grigore Lascu, Lidia Petrova, Cristina Zarioiu, Anca Novac

Abstract: This paper is a review based on a three-level correlational analysis: the behaviour of one compound, of a homologous series, and a third, interseries multiparametric analysis, revealing multiple correlations between the critical temperature and different bond lengths versus the oxygen content. An experimental setup for Tc measurements is described, and a unifying structural scheme for all layered superconducting cuprates is proposed.

Keywords: Superconductivity, Layered cuprates, Homologous series, Correlational analysis, Mercury compound.

Title of the Paper: Analytical and Numerical Results for Detecting Attractors


Authors: Mirela-Catrinel Voicu

Abstract: In this paper we present some qualitative and quantitative results for a particular type of k-order exchange rate model. These results concern the existence of attractors: the fixed point, its stability and its attraction domain, period-p cycles, limit cycles and chaotic attractors. Given their nonlinear nature, the dynamics of these systems cannot be detected using analytical tools alone. For this reason, in the last section we perform numerical simulations and present some examples. The algorithms are implemented in VBA (Visual Basic for Applications) within Excel, and the figures in this paper are produced using Mathematica.

Keywords: Nonlinear system, Attractors, Numerical Simulations, Programming, VBA & Excel.

Title of the Paper: Line Detection Techniques for Automatic Content Conversion Systems


Authors: Costin-Anton Boiangiu, Bogdan Raducanu

Abstract: In an image processing application there is often a need to identify and extract different morphological elements such as characters or lines. This paper studies a general method for identifying vertical and horizontal lines; the techniques described here can, however, also be used to detect other analytical objects such as circles or even ellipses. Line detection is an important add-on to an automatic content conversion system, which builds digital documents from scanned papers. After lines are identified, other layout elements can be extracted: columns, paragraphs, tables, headers. The present paper is a study of the Hough Transform, for which several new enhancements are introduced.
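For axis-aligned lines, Hough-style voting degenerates to simple row and column accumulators, which conveys the core idea (a deliberate simplification of the general transform the paper studies):

```python
def detect_axis_lines(image, min_votes):
    """Simplified Hough-style voting restricted to horizontal and vertical
    lines: every foreground pixel votes for its row and its column, and
    rows/columns whose accumulator reaches min_votes are reported as lines.
    `image` is a list of equal-length strings with '#' as foreground."""
    rows = [0] * len(image)
    cols = [0] * len(image[0])
    for r, line in enumerate(image):
        for c, px in enumerate(line):
            if px == "#":
                rows[r] += 1
                cols[c] += 1
    return ([r for r, v in enumerate(rows) if v >= min_votes],
            [c for c, v in enumerate(cols) if v >= min_votes])

img = ["#####",
       "..#..",
       "..#..",
       "..#..",
       "#####"]
print(detect_axis_lines(img, min_votes=4))  # rows 0 and 4, column 2
```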

Keywords: Automatic content conversion, Line detection, Edge detection, Hough transform, Feature extraction

Title of the Paper: 3D Mesh Simplification Techniques for Image-Page Clusters Detection


Authors: Costin-Anton Boiangiu, Bogdan Raducanu

Abstract: Entity clustering is a vital feature of any automatic content conversion system. Such systems generate digital documents from hard copies of newspapers, books, etc. At application level, the system processes an image (usually in black and white color mode) and identifies the various content layout elements, such as paragraphs, tables, images, columns, etc. Here is where the entity clustering mechanism comes into play. Its role is to group atomic entities (characters, points, lines) into layout elements. To achieve this, the system takes on different approaches which rely on the geometrical properties of the enclosed items: their relative position, size, boundaries and alignment. This paper describes such an approach based on 3D mesh reduction.

Keywords: automatic content conversion, document digitization, layout identification, entity clustering, mesh reduction, heightmaps, terrain, level of detail

Title of the Paper: Internet Congestion Control Model


Authors: Gabriela Mircea

Abstract: In this paper, we consider an Internet model with n access links, which respond to congestion signals from the network, and study the bifurcation of such a system. By choosing the gain parameter as a bifurcation parameter, we prove that a Hopf bifurcation occurs. The stability of bifurcating periodic solutions and the direction of the Hopf bifurcation are determined by applying the normal form theory and the center manifold theorem. Finally a numerical example is given to verify the theoretical analysis.

Keywords: Internet model, Hopf bifurcation, feedback delay, numerical simulation

Title of the Paper: Blending Implicit Shapes Using Fuzzy Set Operations


Authors: Qingde Li, Jie Tian

Abstract: Implicit modelling is a powerful technique for designing geometric shapes, in which a geometric object is described by a real function. In general, the real functions used in implicit modelling are unbounded and can take any value in R; the shapes described by different level sets of an unbounded implicit function can vary significantly and are very unpredictable. In addition, allowing the underlying implicit function to take negative values makes the construction of shape blending operations a difficult task. In this paper, we propose an implicit shape modelling technique in which each implicit shape is represented as the membership function of a fuzzy set, which is bounded and nonnegative with values in [0, 1]. The most obvious benefit of representing an implicit shape as a fuzzy set is that blending a set of implicit shapes becomes simply a problem of aggregating a set of fuzzy sets, which can be done in various ways by choosing a suitable aggregation operator from the wide variety of fuzzy set operations.
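Representing shapes as membership functions makes blending a matter of choosing a fuzzy aggregation operator, e.g. the max t-conorm or the algebraic sum. A small sketch with hypothetical fuzzy disks, purely to illustrate the principle:

```python
import math

def disk(cx, cy, r):
    """Membership function of a fuzzy disk: 1 at the centre, falling smoothly
    to 0 at radius r, clamped to [0, 1] -- a bounded implicit shape."""
    def f(x, y):
        d = math.hypot(x - cx, y - cy)
        return max(0.0, min(1.0, 1.0 - (d / r) ** 2))
    return f

def blend_union(f, g):
    """Fuzzy union via the max t-conorm: the blend contains both shapes."""
    return lambda x, y: max(f(x, y), g(x, y))

def blend_soft(f, g):
    """Algebraic sum a + b - a*b: a smoother aggregation, still in [0, 1]."""
    return lambda x, y: f(x, y) + g(x, y) - f(x, y) * g(x, y)

a = disk(0.0, 0.0, 1.0)
b = disk(1.0, 0.0, 1.0)
u = blend_union(a, b)
s = blend_soft(a, b)
# At the midpoint (0.5, 0) both memberships are 0.75; the algebraic sum
# bulges more than the max, illustrating a softer blend.
print(u(0.5, 0.0), s(0.5, 0.0))
```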

Keywords: Implicit curves and surfaces, isosurfaces, blending operations, Generalized algebraic operations, Piecewise algebraic operations, Fuzzy sets, Soft computing

Title of the Paper: Identifying Web Strategies for Large-Scale Firms


Authors: Tuncay Bayrak

Abstract: The World Wide Web has become increasingly important in helping businesses stay competitive. Hence, in the current business environment, it is essential for large firms in established industries to take advantage of the Web and all it has to offer in order to stay competitive. Integrating the Web into business processes may enable companies to operate more effectively and efficiently while doing the same things that they have always done before. While the Web has transformed many industries, organizations need to make conscious and informed decisions on what aspects of the Web they believe will positively impact their organizations. Our goal is to provide a framework in addressing issues of how to use the Web as a platform to develop various strategies. This research will address various web strategies that can be used by large organizations to support various business functions.

Keywords: Internet, Strategy, Web, Organizational Goals, Integration.

Title of the Paper: A Method for Modeling Service Management of e-Learning


Authors: Jyhjong Lin

Abstract: With the rapid advances of Internet technologies and Web applications in recent years, providing opportunities to learn outside the traditional classroom has gained much attention as a new way for prospective learners to acquire knowledge more conveniently. In this new paradigm of so-called electronic learning (e-Learning), many efforts have been made to build Web-based learning systems that manage the desired e-Learning processes. From the managerial perspective on education, this means that each e-Learning process is monitored and controlled to fulfill an expected learning objective. In this paper, we propose an object-oriented modeling method that addresses this issue by dividing the required mechanisms into three layers: learning objective, learning service agent, and learning service composition. With this architecture, e-Learning processes are managed via the recognition of a learning objective, the employment of a learning service agent that arranges a process of demanded learning services for achieving the objective, and the confirmation of interactions and coordination among these services. For specification, an object-oriented model is presented for each layer, describing the working details of that layer. To illustrate, these models are applied to the fulfillment of an e-Learning plan for learning about Software Engineering, which involves a set of learning objectives to be achieved by various processes of learning services.

Keywords: e-Learning, service management, object-orientation, modeling method

Title of the Paper: A Study for Comparative Evaluation of the Methods for Image Processing Using Texture Characteristics


Authors: Mariana Jurian, Ioan Lita, Florentina Enescu, Daniel Alexandru Visan

Abstract: In this paper a comparative study is made of certain methods of image processing by texture characteristics, to find the optimum method for detecting color texture. The study is based on experiments with two main criteria: the quality of the detection and the response time. The experiments used the co-occurrence matrix and the iso-segment matrix of the gray or color levels. These methods process the image at the pixel level and build matrices that capture particular spatial relations between pixels. The resulting matrices were analyzed and, based on the information they contain, the characteristic vectors associated with each matrix were determined. Each method was studied by analyzing the pixels of the image files in certain situations, directions and distances. The co-occurrence methods gave better results for queries based on water, wood and grass textures, while for those based on sand, rubble and clouds, better results were obtained using the iso-segment matrix.
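The co-occurrence matrix the study relies on can be sketched as follows, together with one texture feature (contrast) derived from it; the image and offset below are illustrative, not the paper's test data:

```python
def cooccurrence(image, dx, dy, levels):
    """Grey-level co-occurrence matrix for displacement (dx, dy):
    M[i][j] counts pixel pairs where a pixel of level i has a neighbour
    of level j at that displacement."""
    M = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                M[image[y][x]][image[ny][nx]] += 1
    return M

def contrast(M):
    """Texture contrast: (i - j)^2 weighted by normalized co-occurrence
    frequencies; high values mean strong local grey-level variation."""
    total = sum(sum(row) for row in M)
    return sum((i - j) ** 2 * v / total
               for i, row in enumerate(M) for j, v in enumerate(row))

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
M = cooccurrence(img, dx=1, dy=0, levels=4)   # horizontal neighbour pairs
print(contrast(M))
```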

Keywords: texture, co-occurrence, iso-segments, pixel, propinquity, grain, contrast, directionality, regularity, roughness, line-likeness.

Title of the Paper: An Efficient Stream Mining Technique


Authors: Hatim A. Aboalsamh, Alaaeldin M. Hafez, Ghazy M. R. Assassa

Abstract: Stream analysis is a crucial component of strategic control across a broad variety of disciplines in business, science and engineering. Stream data is a sequence of observations collected over intervals of time, where each data stream describes a phenomenon; analysis of stream data includes discovering trends (or patterns) in a stream sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis: the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In our study, we emphasize the use of data mining techniques on data streams, where mining techniques and tools are used in an attempt to recognize, anticipate and learn stream behavior with respect to factors that are directly related or seemingly unrelated. The targeted data are sequences of observations collected over intervals of time, each describing a phenomenon or factor; such factors can have either a direct or an indirect impact on the stream data under study. Examples of factors with direct impact include yearly budgets and expenditures, taxation, local stock prices, unemployment rates, inflation rates, fallen angels, and rising odds for upgrades. Indirect factors can include any phenomena in the local or global environment, such as global stock prices, education expenditure, weather conditions, employment strategies, and medical services. The analysis includes discovering trends (or patterns) and associations between sequences in order to generate non-trivial knowledge. In this paper, we propose a data mining technique to predict the dependency between factors that affect performance. The proposed technique consists of three phases: (a) for each data sequence that represents a chosen phenomenon, generate its trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use the trend pattern vectors to predict future factor sequences.
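Phases (a) and (b) can be sketched as follows: a numeric sequence is mapped to trend symbols, and recurring trend patterns are counted. This is a simplified counting illustration, not the authors' maximal-pattern algorithm:

```python
from collections import Counter

def trend_sequence(series, eps=0.0):
    """Phase (a): map a numeric sequence to a trend string,
    'U' for up, 'D' for down, 'S' for steady (change within eps)."""
    out = []
    for prev, cur in zip(series, series[1:]):
        d = cur - prev
        out.append("U" if d > eps else "D" if d < -eps else "S")
    return "".join(out)

def frequent_patterns(trends, min_support):
    """Phase (b), simplified: count every contiguous trend substring and
    keep those occurring at least min_support times -- the counting idea
    behind frequent trend pattern discovery."""
    counts = Counter(trends[i:i + n]
                     for n in range(1, len(trends) + 1)
                     for i in range(len(trends) - n + 1))
    return {p: c for p, c in counts.items() if c >= min_support}

t = trend_sequence([10, 12, 11, 13, 12, 14])   # toy observations
print(t)                        # 'UDUDU'
print(frequent_patterns(t, 2))  # 'UD' and 'DU' recur in the sequence
```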

Keywords: Data Mining, Stream Mining, Time Series Mining, Mining Trends, Data Sequences, Association Mining, Maximal Trend Patterns, Global Trends, Local Trends.

Title of the Paper: Study on Pseudo Update of Building Shape in Road Ledger Digital Map


Authors: Tsuyoshi Takayama, Akitsugu Oki, Hidemi Fukada, Yoshitoshi Murata, Nobuyoshi Sato, Tetsuo Ikeda

Abstract: Recently, in the area of GIS (Geographic Information Systems), many researchers have been attracted to the efficient updating of digital map data. However, the update cost of conventional methods is high. Moreover, when a municipal government carries out its work using GIS, the following problem occurs: the work is not easy to carry out because the shape, or even the existence, of a building on the base map of the road ledger generated from GIS does not always correspond to reality. In our discussion, we limit the scope to building shape. The purpose of our research is to develop an efficient and practical method for updating building shape that satisfies the following two conditions: (i) it does not require a data structuring phase from raster images, so its update cost is not too high, and (ii) it uses a map that is updated more frequently than the base map to be updated, and is practically available. We investigate semi-automatic, pseudo update of buildings by employing an urban design map that has a higher update frequency but lower precision than the base map of the road ledger. Here, 'pseudo' means that our primary interest is carrying out municipal government work, so we give first priority to correctly reflecting the shape, or existence, of a building, even at the expense of map precision. In the present paper, we propose a concrete algorithm and evaluate it. In the evaluation, our proposal achieved 85.0% recall and 91.9% precision, and it also obtained good results in a qualitative evaluation.
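Recall and precision as used in the evaluation are the standard retrieval metrics. The sketch below shows the computation; the match counts are hypothetical, chosen only to reproduce figures close to the reported 85.0% recall and 91.9% precision.

```python
def recall_precision(tp, fn, fp):
    """tp: buildings correctly updated; fn: true buildings missed;
    fp: spurious updates reported."""
    recall = tp / (tp + fn)        # fraction of true buildings recovered
    precision = tp / (tp + fp)     # fraction of reported updates that are correct
    return recall, precision

# Hypothetical counts (not from the paper) matching the reported rates.
r, p = recall_precision(tp=340, fn=60, fp=30)
print(f"recall={r:.1%} precision={p:.1%}")  # recall=85.0% precision=91.9%
```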

Keywords: GIS (Geographic Information System), update, frequency, recall, precision, evaluation, municipal government work.

Title of the Paper: Constructing a Data Schema from an Information Flow Model


Authors: Junkang Feng, Sufen Wang

Abstract: The ‘information content’ of a data schema concerns the capacity of a database to represent the information that the database is designed to provide. It is recognized to be ‘difficult to define and measure’. Our literature survey suggests that this is an unsolved problem, and the difficulties seem to lie in the lack of separation between information and data, and particularly in the intuitive treatment of information. We examine what is required to solve this problem. We propose an approach to information and information flow for conceptual data modeling that draws on a set of contemporary theories concerning the semantic aspect of information. With this approach, we formulate an information flow model from human purposeful activities, from which a data schema is constructed. This way it can be ensured that the data schema represents the required information, and therefore the latter is definitely in the ‘information content’ of the former. We observe that this constitutes a possible solution to the problem, and that it also represents a ‘semantic information theoretic’ approach to conceptual data modeling. This work is the result of a substantial study of the problem, including several real-world case studies.

Keywords: Information content, Conceptual modeling, Database design, Requirements engineering, Human purposeful activity


Issue 8, Volume 5, August 2008

Title of the Paper: Ontology used in an E-Learning Multiagent Architecture


Authors: Daniel Hunyadi, Iulian Pah

Abstract: The main goal of this article is to develop a virtual educational environment model that facilitates learning by using collaboration (and, by extension, the team-research model) as a form of social interplay. The model represents a universe where human agents interact with artificial agents (software agents). Given its vision, the system can be classified among advanced systems, for it is client-oriented (toward the student) and provides value-added educational services due to its collaborative learning attribute. The model proposes an original architecture where elements of the socio-cultural theory of collaborative learning are assigned to the artificial intelligence components (the multi-agent system). The expected results are: conceptual models (agents, learning and teaching strategies, student profiles and group profiles, communication between agents, negotiation strategies and coalition formation), software entities, and a methodology to evaluate the performance of e-learning systems.

Keywords: human-computer interaction (HCI), multi-agent system, multi-agent architectures, collaborative
learning, artificial agents.

Title of the Paper: Web Image Retrieval Systems with Automatic Web Image Annotating Techniques


Authors: Hsien-Tsung Chang

Abstract: Due to the popularity of digital cameras and web authors' enrichment of visual aesthetics, the number of web images is growing at an uncontrolled speed. The images on the World Wide Web are becoming a large image library for browsing, and how to retrieve these images accurately is an important issue. In this paper we describe the architecture of web image retrieval systems with automatic image annotation techniques, and we propose four methods to generate annotations automatically for every image from its hosting web page, by analyzing structural blocks, collecting anchor text from link structures, and gathering annotations shared with other images that have the same visual signature.

Keywords: Image Annotation, Image Retrieval

Title of the Paper: Accident States Simulation. Process Fluids Release


Authors: Cornelia Croitoru, Mihai Anghel, Floarea Pop, Ioan Stefanescu, Gheorghe Titescu, Mihai Patrascu, Ervin Watzlawek, Dorin Cheresdi

Abstract: The Seveso II Directive requires a quantitative risk evaluation of major accidents for highly hazardous plants. In a general context, risk is defined as the product of the frequency and the consequences of an accident state. There are five steps in quantitative risk assessment: identification of significant accident-initiating events, development of accident sequences, frequency estimation for accident sequences, computation of post-accident event parameters, and consequence estimation. In the case of hazardous emissions, characterizing post-accident events means calculating flow rates, quantities and durations. The paper presents mathematical models used to describe the release of process fluids in emergency states, both locally and through safety systems, as well as the results obtained with simulation programs elaborated for heavy water concentration plants based on chemical exchange between water and hydrogen sulphide.

Keywords: Hazard, Risk, Release, Safety system, Simulation, Hydrogen sulphide, Mathematical model

Issue 9, Volume 5, September 2008

Title of the Paper: An Analysis on Taiwan Broiler Farm Prices under Different Chicken Import Deregulation Policies


Authors: Meng-Long Shih, Shouhua Lin, Biing-Wen Huang, Wei-Yu Hu, Chi-I Hsu

Abstract: - According to changes in chicken import policy, three periods can be distinguished: controlled import, quota import and free import. This study adopts GARCH models and weekly data to analyze broiler farm prices during these different periods. The empirical test results indicate that the long-run persistence effect of shocks is larger in the free import period. The reactions of broiler farm prices to lagged broiler farm prices, the surveyed number of chicks after six weeks, pig prices, colorful broiler farm prices, chick prices and feedstuff prices differ across the three periods.
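In a GARCH(1,1) model, the "long-run persistence of shocks" is governed by the sum of the ARCH and GARCH coefficients (alpha + beta). The recursion below is a generic sketch of the conditional-variance update; the parameter values are illustrative, not estimates from the paper.

```python
def garch11_variance(returns, omega, alpha, beta):
    """GARCH(1,1) recursion: sigma2[t] = omega + alpha*r[t-1]**2 + beta*sigma2[t-1].
    Persistence of shocks grows as alpha + beta approaches 1."""
    assert alpha + beta < 1, "stationarity condition"
    sigma2 = [omega / (1 - alpha - beta)]  # start at the unconditional variance
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r ** 2 + beta * sigma2[-1])
    return sigma2

# A large price shock raises next-period conditional variance, which then
# decays geometrically back toward the unconditional level.
vols = garch11_variance([0.0, 5.0, 0.0, 0.0, 0.0], omega=0.1, alpha=0.1, beta=0.8)
```

Comparing the estimated alpha + beta across the three import-policy periods is what quantifies the larger persistence found in the free-import period.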

Keywords: - Import regulation periods, Volatility, Price response, GARCH model.

Title of the Paper:  Matlab-like Scripting for the Java Platform with the jLab environment


Authors:  Stergios Papadimitriou, Konstantinos Terzidis

Abstract: - The jLab environment extends the potential of Java for scientific computing. It provides a Matlab/Scilab-like scripting language that is executed by an interpreter implemented in the Java language. The jLab environment effectively combines Groovy-like compiled scripting with the interpreted jScript language. A special-purpose modification of the Groovy language, called GroovySci, was developed for effective compiled scripting. The paper concentrates on using the jLab scripting engine from within a pure Java application, in order to allow the application to utilize the scientific scripting potential of jLab and its large scientific libraries. The implementation is inspired by the JSR 223 standard but is much simpler. The same methodology for script invocation can also be used within the Groovy compiled scripting framework. We describe the basics of the Groovy scripting environment. To our knowledge, this is the first full Matlab-like scientific scripting environment for Java. The jLab environment is open source and can be downloaded from

Key-Words: - Java, Scripting, Interpreters, Matlab, Scientific Programming, Groovy Scripting.

Title of the Paper:   Environment-Independent Methodology for Accessing External Data Sources


Authors: Laura M. Castro, Victor M. Gulias, Carlos Abalde, Javier Paris

Abstract: - Software engineering is not a static field. Hardware is evolving, and so must software development. Someone walking into a computer store today and buying a personal computer will most likely end up owning a machine with more than one CPU, and that machine will most likely end up on a network, connected to many other machines and devices. We are talking about parallelism and distribution, two features that threaten to make the software development process harder. To cope with these new parameters, new tools are claiming a place in the vanguard of software creation. At the same time, there are also well-established components, such as traditional databases, that we still use (and need to use) the same way they have been used for many years. In this article, an environment-independent methodology for combining these two different worlds is presented, showing that past and future can work together if we properly use abstraction and high-level software engineering tools.

Key-Words: - Software engineering, database access, design patterns, functional programming.

Title of the Paper: Assessing Value of Software Architecture: A Case Study


Authors: Pasi Ojala

Abstract: - During the last decades software architecture has become increasingly important for companies creating competitive product structures. Recently, more and more attention has also been focused on the costs, cost-effectiveness, productivity and value of software development and products. This study outlines the concepts, principles and process of implementing a value assessment for software architecture, as well as existing possibilities for implementing value assessments. The main purpose of this study is to collect experience on whether value assessment of product architecture is useful for companies and works in practice, and on the strengths and weaknesses of using it. This is done by implementing the value assessment in a case company step by step, to see which phases work and which do not. The practical industrial case shows that the proposed value assessment of product architecture is useful and supports companies trying to find value in product architecture.

Key-Words: - Software process and product improvement, Architecture, Assessment, Value, Worth, Cost and Value engineering.

Title of the Paper: A Distributed Virtual M&S Framework for Military Tactical Training


Authors:  SeongKee Lee, ChanGon Yoo, JungChan Park, JaeHyun Park

Abstract: This paper presents a distributed simulation framework for tactical training in a Networked Virtual Environment (NVE). NVE technology for operational tactical training systems now plays a significant role in military training courses. Existing military training simulation systems are mostly fully simulated systems operating on a single platform; they train an individual's operational skills, but do not support team-level tactical training. In order to train team-level combat and command skills in a dynamic battlefield, interaction among distributed combat objects and diverse battlefield composition are required. This paper designs a distributed simulation framework to satisfy these requirements. The framework produces the networked virtual environment using virtual reality, event-based simulation, and HLA/RTI-based interoperation techniques. Based on the framework, this paper also implements a small-scale tactical training system.

Key-Words: Modeling and simulation, Simulation Framework, Networked virtual environment, Tactical training.

Title of the Paper: Efficient and Secure Protocol in Fair Certified E-Mail Delivery


Authors:  Ren-Junn Hwang, Chih-Hua Lai

Abstract: - An efficient and secure protocol for certified e-mail delivery is proposed in this paper. With the widespread use of the public Internet, communication via electronic mail (e-mail) has become a convenient application replacing the traditional manuscript letter. People can easily append their digital signature to an e-mail in order to achieve non-repudiation of origin. However, in the standard e-mail service, evidence of receipt still relies on the willingness of the recipient; hence, the recipient bears no responsibility for the received e-mail. In this paper, we present an efficient and secure protocol for fair certified e-mail delivery (CEMD). Our protocol provides non-repudiation of origin and of receipt in a fair manner: the sender obtains an irrefutable receipt if and only if the recipient gets the certified e-mail from the sender; otherwise, neither party obtains anything. Moreover, the proposed CEMD protocol sends further mails to the same recipient efficiently by using a pre-computation function. Evaluations of computational cost and communication overhead show that our protocol is more cost-effective and efficient than other relevant protocols.

Key-Words: - Certified e-mail, Digital signature, Fair exchange, RSA signature, Security.

Title of the Paper: A Simulation Study to Increase The Capacity of a Rusk Production Line


Authors: Seraj Yousef Abed

Abstract: - This study was conducted in a food processing company on its rusk production line. The goal of the study was to increase the production rate of the line to meet continuously increasing demand for its product within the existing limited space in the plant. The production line was thoroughly studied and analyzed, and several bottlenecks causing severe congestion in different areas of the line were found. An Arena simulation model was developed and used to resolve all bottlenecks found on the line, and a simulation experiment consisting of seven different scenarios was conducted to search for a good feasible solution to increase the production rate. The changes made to the production line resolved all bottlenecks, improved utilization of all production equipment, and eliminated all congestion and most of the queues at all production stations. An increase of about 50% in production and a decrease of 11.4% in average total production time for a box of rusk in the system were achieved. The capital investment required to implement the improvements can be paid back within 35 days from the expected profit realized from the additional quantity produced. The changes made to the production line to achieve these improvements were adding two new machines, replacing three old machines, modifying two other machines, and decreasing the time of one of the processes, without affecting the quality of the product.

Key-Words: - Production planning, Food processing, Productivity, Simulation models, Business Process Reengineering.

Title of the Paper: Principles for Support of the Business Processes


Authors:  Dzenana Donko, Ismet Traljic

Abstract: - This paper describes basic components and principles for the support of normatively regulated organizational activities. These activities are characterized by a precise objective or purpose, the participation of actors as role-holders, and norms and rules that govern their performance. In order to perform normatively regulated activities efficiently and effectively, actors need proper information and documents, but also have to act in accordance with the relevant norms and rules. This paper focuses on a particular aspect and the modeling of normatively regulated activities. A particular example, claim processing, is described as a formal model. Some aspects of the applicability of an object view of normatively regulated activities are described as an improvement on the more complex case of a procurement activity.

Key-Words: - Normatively regulated organizational activities, object oriented modeling, business processes.

Issue 10, Volume 5, October 2008

Title of the Paper: Determination of Insurance Policy Using Neural Networks and Simplified Models with Factor Analysis Technique


Authors: Yu-Ju Lin, Chin-Sheng Huang, Che-Chern Lin

Abstract: In this paper, we use feed-forward neural networks with the back-propagation algorithm to build decision models for five insurances: life, annuity, health, accident, and investment-oriented insurance. Six features (variables) were selected as inputs to the neural networks: age, sex, annual income, educational level, occupation, and risk preference. Three hundred insurants from an insurance company in Taiwan were used as examples for establishing the decision models. Six experiments were conducted in this study, grouped into two phases: Phase 1 (Experiments 1 to 3) and Phase 2 (Experiments 4 to 6). In Phase 1, we used the six features as inputs to the neural networks. In Phase 2, we employed the factor analysis method to select the three most important of the six features. In Phase 1, Experiment 1 used a single neural network to classify the five insurances simultaneously, while Experiment 2 utilized five neural networks to classify them independently. Experiments 1 and 2 adopted the purchase records of primary and additional insurances as experimental data; Experiment 3, however, utilized the primary insurance purchase data only. In Phase 2, we repeated a similar experimental procedure to Phase 1. We also applied statistical methods to test the differences in classification results between Phases 1 and 2. Discussion and concluding remarks are provided at the end of this paper.
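The modeling setup can be sketched as a one-hidden-layer feed-forward network trained by back-propagation, with six inputs and five outputs mirroring the feature and insurance counts. The data below is random and purely illustrative, and the hidden-layer size, learning rate, and epoch count are our assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# 6 input features -> 8 hidden units -> 5 outputs (one per insurance type).
# Random toy data stands in for the 300 real insurants used in the paper.
X = rng.random((30, 6))
Y = (rng.random((30, 5)) > 0.5).astype(float)
W1 = rng.normal(0.0, 0.5, (6, 8))
W2 = rng.normal(0.0, 0.5, (8, 5))
lr, n = 0.5, X.shape[0]

loss_before = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - Y) ** 2)
for _ in range(2000):
    H = sigmoid(X @ W1)              # forward pass
    O = sigmoid(H @ W2)
    dO = (O - Y) * O * (1 - O)       # output-layer error signal
    dH = (dO @ W2.T) * H * (1 - H)   # back-propagated hidden-layer error
    W2 -= lr * (H.T @ dO) / n        # gradient descent updates
    W1 -= lr * (X.T @ dH) / n
loss_after = np.mean((O - Y) ** 2)   # training error shrinks over epochs
```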

Key-words: - Insurance policy, Neural networks, Back-propagation algorithm, Classification, Factor analysis, Feature extraction.

Title of the Paper: A Case Study on the SCORM-Based E-learning in Computer-Aided Drafting Course with Users’ Satisfaction Survey


Authors: Che-Chern Lin, Jia-Hseng Pan

Abstract: - In this paper, we propose a case study comparing the learning difference between traditional learning and e-learning in a Computer-Aided Drafting (CAD) course. The learning materials for the e-learning were designed to the Sharable Content Object Reference Model (SCORM) standard. Seventy-four students from a vocational high school in Taiwan participated in the experiment. These students were divided into two groups: a control group and a treatment group. The control group received traditional instruction in a regular classroom, while the treatment group used an e-learning platform to conduct learning activities. The experimental results show no difference in learning performance between traditional learning and e-learning in the CAD course. Finally, a survey was also conducted to gauge users' satisfaction with the e-learning course.

Key-words: - CAD, e-learning, SCORM, Engineering Education, Moodle, User satisfaction.

Title of the Paper: A Study on Internet Usages, Academic Achievements, and the Exploring Capability of Regional Culture Knowledge Using Internet – A Case of Primary School Students in Taiwan


Authors:  Che-Chern Lin, Wen-Shun Chen

Abstract: - In this paper, we present a case study of the internet usage of primary school students and their exploratory capabilities when using the internet in a regional culture course. 226 fifth-grade students from a primary school in Taiwan were selected as samples. We designed a questionnaire to analyze the internet usage and behaviors of the sampled students, including time spent on the internet; frequency, location, and reason for using the internet; internet activities; and recognition of internet functionality. We designed questions for a regional culture test and conducted experiments to analyze the relationships between internet usage and the scores on the regional culture test. Furthermore, we analyzed the relationships among students' backgrounds, academic achievements, internet usage, and the scores on the regional culture test. Concluding remarks and suggestions for future studies are provided at the end of this paper.

Key-words: - Regional culture, Exploring capability, Internet usages, Academic achievement.

Title of the Paper: Systems Modeling on the Basis of Rough and Rough-Fuzzy Approach


Authors:  Jirava Pavel, Krupka Jiri, Kasparova Miroslava

Abstract: In this paper, the modeling of information, economic and social systems is presented. The models are based on rough set theory and on combined fuzzy and rough set theory. These models represent two real information systems and a system of internal human population migration. The information systems are represented as a table where every column represents an attribute (a variable, a property); an attribute can be measured or may be supplied by a human expert. Questionnaires were used to obtain the necessary data. For the migration model, selected socioeconomic indicators are applied: economic and demographic indicators that affect the size of migration for districts in the Czech Republic are defined. In data pre-processing we focused on different processing of the data inputs, applying selected data discretization techniques to all indicators. In the migration model creation phase we deal with a new design of membership function shapes and rule base definition. The classifier models were implemented in MATLAB. The experiments performed demonstrate the accuracy of the proposed approach.
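At the core of such rough-set models are the lower and upper approximations of a target set under the indiscernibility relation induced by the chosen attributes. A minimal sketch follows; the toy table and attribute names are invented for illustration, not the paper's data.

```python
from collections import defaultdict

def approximations(table, attributes, target):
    """Rough-set lower/upper approximations of `target` under the
    indiscernibility relation induced by `attributes`."""
    classes = defaultdict(set)                 # equivalence classes
    for obj, row in table.items():
        classes[tuple(row[a] for a in attributes)].add(obj)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:                      # class certainly inside target
            lower |= cls
        if cls & target:                       # class possibly inside target
            upper |= cls
    return lower, upper

# Invented toy table: o1 and o2 are indiscernible on 'income', so the set
# {"o1"} is only roughly describable by that attribute alone.
table = {
    "o1": {"income": "high"},
    "o2": {"income": "high"},
    "o3": {"income": "low"},
}
lower, upper = approximations(table, ["income"], {"o1"})
# lower is empty, upper is {"o1", "o2"}: the boundary region is non-empty
```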

Key-words: Modeling, rough sets theory, fuzzy sets, information system, internal human migration, evaluation, classification.

Title of the Paper: New Developments for Determinacy Analysis: Diclique-based Approach


Authors:  Grete Lind, Rein Kuusik

Abstract: - Determinacy Analysis (DA) is a method for solving data mining tasks: it enables describing, by rules, the subset of objects determined by the user. There are different treatments of DA: step-by-step versus DAS-like in algorithmic view, one solution versus multiple solutions as a result, and additive versus non-additive sets of rules in systematic view. The essence of the method itself does not change; only the rules change. The base DA algorithms have a number of shortcomings: they are too labour-intensive (the step-by-step approach) or they find only a limited set of rules (i.e. only one of many possible rule systems, in the case of DAS). In this paper we show that DA can be reduced to the diclique finding task well known from graph theory, present the prerequisites to take into account, and explain how this influences the rules. Diclique-based DA enables new types of DA tasks to be set up: for example, finding a system with a minimal number of rules, or a system with a minimal number of shortest rules. By reducing DA to a diclique finding task, the basis for a new generation of DA algorithms is founded.

Key-Words: - Determinacy Analysis, Data mining, Rules, Diclique, Diclique extracting task.

Title of the Paper: Web Services Research Challenges, Limitations and Opportunities


Authors: Florije Ismaili, Bogdan Sisediev

Abstract: - Service Oriented Architecture (SOA) is an architectural style in which software components that provide pieces of functionality communicate with each other via the messages they exchange. Within SOA these pieces are called services. Nowadays, the technology platform most associated with the realization of SOA is Web Services. Web Services have received much interest due to their ability to transcend programming language, operating system, network communication protocol, and data representation dependencies and issues. In this paper we propose a new XML and Web Services framework that relies on the Service Oriented Architecture and reorganizes applications into a set of services.

Key-Words: - SOA, Web Services, Web Services Framework, Research Challenges.

Title of the Paper: iCamp Space - An Environment for Self-Directed Learning, Collaboration and Social Networking


Authors: Tomaz Klobucar

Abstract: - This paper describes a learning environment for self-directed learning, collaboration and social networking, composed of loosely coupled Web 2.0-based educational tools. The set of building blocks includes blogs, wikis, social bookmarking tools, tools for synchronous and asynchronous communication, tools for federated search, tools for the management of learning contracts and personal communication, feed aggregators, tools for scheduling appointments, etc. Contrary to monolithic learning environments, such as learning management systems, learners can choose tools from the set by themselves and compose a personal learning environment according to their preferences. The environment has been validated in the teaching process in the context of higher education: five facilitators and 27 students from 4 countries participated in a trial conducted from April 2007 to June 2008. The presented work has been performed within the iCamp project of the 6th Framework Programme of the EU.

Key-Words: - self-directed learning, iCamp, learning environment, social networking, collaboration, trial.

Title of the Paper: Design of a Real Time Transaction Processing Monitor  (TPM) Benchmark Testbed


Authors: Maria Luisa Catalan, Dennis A. Ludena R., Hidenori Umeno, Masayoshi Aritsugi

Abstract: - The Transaction Processing Monitor (TPM) is the most widely used middleware in e-commerce systems on the internet, from large enterprises to medium and small businesses. Due to its growing popularity, more efficient TPM performance is now a major concern among developers and researchers. A high-end benchmark platform for TPMs is at present vital to meet the high performance needs of online transactions. In addition to the performance characteristics of the TPM, we also have to ensure the security of the transactions. In this paper, we perform a detailed analysis of the current software packages available for this application, and we propose a secure, isolated, and highly configurable environment using the real-time emulation capabilities of NS2 and the virtualization capabilities of Xen, in order to provide a testbed with the characteristics and behavior of a real network.

Key-Words: - Virtualization, Emulation, Networking, Transaction Processing, Benchmark, Modelling

Title of the Paper: Design and Development of a Secure Military Communication based on AES Prototype Crypto Algorithm and Advanced Key Management


Authors: Nikolaos G. Bardis, Nikolaos Doukas, Konstantinos Ntaikos

Abstract: - In this article, a study is presented that aims at the development of a prototype system for the secure real-time exchange of messages between users of workstations connected to the same TCP/IP network. Security is provided by the AES prototype cryptographic algorithm. An advanced key management scheme is used within this system that enhances its security, reduces the effects of possible security breaches, and simultaneously hides from users the unnecessary complexity of handling multiple encryption keys. The intended scope of application is military units, and the system is meant to become the basis for the design and development of an integrated framework for the exchange of secure messages between different sites of military or other organizations concerned about information security. The present design is limited in its application to local area networks only; there are, however, no fundamental restrictions, and an expansion to wide area networks and the internet is also possible. The design of the application is presented first. Problems of security and ease of use related to the management of the secret encryption keys are explained, and a solution based on an innovative scheme for key storage and management is then presented. The design and implementation of the application are presented in detail, along with a description of its basic functionality. Plans for the deployment and further development of the application are described, and conclusions are finally drawn.

Key-Words: - Secure messaging, AES, encryption, key management


Issue 11, Volume 5, November 2008

Title of the Paper: Scientific Programming with an Environment that Combines Effectively Compiled and Interpreted Scripting at the Java Platform


Authors: Stergios Papadimitriou, Konstantinos Terzidis

Abstract: The jLab environment extends the potential of Java for scientific computing. It provides a Matlab/Scilab-like scripting language that is executed by an interpreter implemented in the Java language. The scripting language supports the basic programming constructs together with Matlab-like matrix manipulation operators. The jLab "core" provides general-purpose functionality with an extensive set of built-in mathematical routines that cover all the basic numerical analysis tasks. The important advantage of jLab compared to other similar environments is the ability to dynamically and automatically integrate Java code into the system, both to obtain execution speed and to reduce programming effort. This task is supported both by an easy-to-use extension Java class wizard and by application-specific class wizards that automate the utilization of jLab's scientific libraries. However, the incorporation of external general-purpose Java code is not as convenient as scripting code development. Also, j-scripting is relatively slow compared to Groovy scripting, which operates by compiling scripts to Java classes. This was the motivation for adapting the general-purpose Groovy "scripting SuperJava" language as a parallel and cooperative scripting option in the jLab environment. The paper concentrates on the issues involved in implementing the multi-scripting environment and on the benefits obtained by combining these two very different scripting frameworks. The Groovy agile scripting language for the Java platform is both very flexible and powerful. We describe the modifications to the Groovy language and some of the most basic extensions we have implemented in order to build GroovySci, the compiled scripting language of the jLab platform.

Keywords: Java, Scripting, Interpreters, Matlab, Scientific Programming, Class Loaders, Groovy, Binding

Title of the Paper: Traversal Patterns for Content Designed Web Environment


Authors: Perwaiz B. Ismaili, Richard M. Golden

Abstract: This paper describes new ways of observing the effects of content presentation and domain knowledge on navigation behavior, by designing a web (hypertext) presentation format that adheres to content design inspired by the discourse and text comprehension literature. More specifically, logical connections between web pages at the macro level were constructed meticulously for all web sites and kept consistent across all three knowledge domains. Twenty undergraduate psychology students participated in this preliminary study investigating the influence of domain knowledge and content presentation on hypertext (web) site traversal behavior. Classical data analysis (ANOVA) was used to explore these qualitative phenomena. Contrary to our expectation, expertise differences were not significant for the total number of web pages (nodes) visited or the overall time spent on each knowledge domain's web sites. However, these differences were significantly strong for super-ordinate nodes, nodes with more semantic (logical) connections, and irrelevant nodes.

Keywords: Hypertext, Navigation, User Behavior, Content Design, Expertise, Scientific Text, Web Design

Title of the Paper: Real-Time Face Detection using Dynamic Background Subtraction


Authors: K. Sundaraj

Abstract: Face biometrics is an automated method of recognizing a person's face based on a physiological or behavioral characteristic. Face recognition works by first obtaining an image of a person; this process is usually known as face detection. In this paper, we describe an approach for face detection that is able to locate a human face embedded in an outdoor or indoor background. Segmentation of novel or dynamic objects in a scene, often referred to as background subtraction or foreground segmentation, is a critical early step in most computer vision applications in domains such as surveillance and human-computer interaction. Previous implementations aim to properly handle one or more problematic phenomena, such as global illumination changes, shadows, highlights, foreground-background similarity, occlusion and background clutter. Satisfactory results have been obtained, but very often at the expense of real-time performance. We propose a method for modeling the background that uses per-pixel time-adaptive Gaussian mixtures in the combined input space of pixel color and pixel neighborhood. We add a safety net to this approach by splitting the luminance and chromaticity components of the background and using their density functions to detect shadows and highlights. Several criteria are then combined to discriminate foreground and background pixels. Our experiments show that the proposed method is robust to problematic phenomena such as global illumination changes, shadows and highlights, without sacrificing real-time performance, making it well-suited for live video applications, such as face biometrics, that require face detection and recognition.

Keywords: Face Detection, Background Subtraction
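
The per-pixel, time-adaptive background model described above can be illustrated in miniature. The sketch below simplifies the paper's Gaussian mixtures to a single adaptive Gaussian per pixel; the function name, learning rate and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

def update_background(mean, var, frame, alpha=0.05, k=2.5):
    """One step of a per-pixel running-Gaussian background model.

    mean, var : current background mean and variance per pixel
    frame     : new grayscale frame (same shape)
    alpha     : learning rate of the time-adaptive model
    k         : foreground threshold in standard deviations

    Returns (foreground_mask, new_mean, new_var).
    """
    diff = frame - mean
    foreground = np.abs(diff) > k * np.sqrt(var)
    # Adapt the model only where the pixel still looks like background.
    bg = ~foreground
    mean = np.where(bg, (1 - alpha) * mean + alpha * frame, mean)
    var = np.where(bg, (1 - alpha) * var + alpha * diff**2, var)
    return foreground, mean, np.maximum(var, 1e-6)

# Toy usage: a static 4x4 scene in which one "object" pixel appears.
mean = np.full((4, 4), 100.0)
var = np.full((4, 4), 25.0)
frame = mean.copy()
frame[2, 2] = 200.0  # novel foreground object
mask, mean, var = update_background(mean, var, frame)
```

A full implementation along the paper's lines would keep several Gaussians per pixel and add the luminance/chromaticity split for shadow and highlight detection.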

Title of the Paper: Non-linear Estimation Methods for Hematocrit Density based on Changing Pattern of Transduced Anodic Current Curve


Authors: Hieu Trung Huynh, Jung-Ja Kim, Yonggwan Won

Abstract: Many studies have reported that hematocrit (HCT) is the factor that most strongly affects the accuracy of glucose measurements made by portable/handheld devices. It is also known to be an important factor in clinical decision-making. Therefore, estimation of HCT plays a crucial role in enhancing the accuracy of glucose measurements and the performance of therapy. In this paper, we present novel methods for hematocrit estimation from the transduced current curve produced by the glucose-oxidase reaction in strip-type electrochemical biosensors. The proposed methods are nonlinear, and include neural networks and support vector machines. The input features are composed of two parts: sampled points of the time-varying current curve and extra features computed from those sampled points.

Keywords: Hematocrit, hematocrit estimation, nonlinear methods, biosensors, transduced current curve

Title of the Paper: Using RFID Technology in Food Produce Traceability


Authors: Ruey-Shun Chen, C.-C. Chen, K. C. Yeh, Y.-C. Chen, C.-W. Kuo

Abstract: Food safety incidents occur frequently because of epizootic diseases. Many countries build food traceability systems to address these problems. However, current food traceability systems rely on paperwork and require a great deal of manpower, and they cannot trace and track the origin and destination of food. Because RFID technology can trace individual objects, it can solve these problems. This research integrates RFID technology into a food produce traceability system. With RFID technology it is easy to trace each individual object, not just lots of goods. RFID technology can also record all events automatically, and information about food production can be acquired with handheld devices. The result of this paper is an integrated traceability system for the entire food supply chain based on RFID technology. The benefit of this research is that food production can be traced, letting consumers obtain complete food production information so that they can choose and buy safe food.

Keywords: RFID, Produce traceability, Food supply chain, Food safety

Title of the Paper: Application of Multifractal Analysis on Medical Images


Authors: Jelena Andjelkovic, Natasa Zivic, Branimir Reljin, Vladimir Celebic, Iva Salom

Abstract: This paper presents the results of computer analysis of medical images, with the aim of finding differences between them so that malignant tissue can be separated from normal and benign tissue. The diagnosis of malignant tissue is of crucial importance in medicine. Therefore, establishing the correlation between multifractal parameters and "chaotic" cells could be of great practical value. This paper shows the application of multifractal analysis as an additional aid in cancer diagnosis, as well as a means of diminishing the subjective factor and the probability of error.

Keywords: Fractals, Multifractals, Holder exponent, Medical images, Carcinomas, FracLac, FracLab

Title of the Paper: Methodology of Fuzzy Usability Evaluation of Information Systems in Public Administration


Authors: Miloslav Hub, Michal Zatloukal

Abstract: This paper proposes a methodology for the usability evaluation of information systems in public administration based on fuzzy logic theory. The first part of the paper is devoted to the problem formulation. The following parts formulate the usability evaluation methodology aimed at public administration information systems. The authors introduce new ways to evaluate the user interface with the help of vague terms. Fuzzy Usability Evaluator, an application that is able to operate with the vague nature of such evaluations, is also introduced.

Keywords: Usability, information systems, public administration, fuzzy logic, methodology, software quality, software engineering

Title of the Paper: Optimizing the Minimum Vertex Guard Set on Simple Polygons via a Genetic Algorithm


Authors: Antonio L. Bajuelos, Santiago Canales, Gregorio Hernandez, Ana Mafalda Martins

Abstract: The problem of minimizing the number of vertex guards necessary to cover a given simple polygon (the MINIMUM VERTEX GUARD (MVG) problem) is NP-hard. This computational complexity opens two lines of investigation: the development of algorithms that establish approximate solutions, and the determination of optimal solutions for special classes of simple polygons. In this paper we follow the first line of investigation and propose an approximation algorithm, based on the general metaheuristic of genetic algorithms, to solve the MVG problem. Based on our algorithm, we conclude that on average the minimum number of vertex guards needed to cover an arbitrary and an orthogonal polygon with n vertices is n/6.38 and n/6.40, respectively. We also conclude that this result is very satisfactory in the sense that it is always close to optimal (with an approximation ratio of 2 for arbitrary polygons and 1.9 for orthogonal polygons).

Keywords: Computational Geometry, Art Gallery Problems, Visibility, Approximation Algorithms, Metaheuristics, Genetic Algorithms
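
To illustrate how a genetic algorithm can attack this kind of covering problem, the sketch below evolves bit-strings that select vertex guards over a toy, hand-made visibility table. The visibility sets, fitness weights and GA parameters are all hypothetical; the paper's actual encoding and operators are not reproduced. In the real MVG problem the sets would come from computing visibility polygons.

```python
import random

# Hypothetical visibility model: vis[i] is the set of polygon regions
# that a guard placed at vertex i can see.
vis = [
    {0, 1}, {1, 2}, {2, 3}, {3, 4}, {4, 0}, {0, 2, 4},
]
ALL = {0, 1, 2, 3, 4}

def fitness(bits):
    covered = set().union(*(vis[i] for i, b in enumerate(bits) if b))
    # Heavily penalize uncovered regions, lightly penalize guard count.
    return 10 * len(ALL - covered) + sum(bits)

def ga(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in vis] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(vis))      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # bit-flip mutation
                j = rng.randrange(len(vis))
                child[j] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = ga()
```

With the penalty weights above, any full-coverage individual dominates any partial one, so the search first achieves coverage and then trims redundant guards.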

Title of the Paper: Fast Algorithm for Detecting the Most Unusual Part of 2d and 3d Digital Images. Application to Large Medical Databases


Authors: Kostadin Koroutchev, Elka Korutcheva

Abstract: In this paper we introduce a fast algorithm that can detect the most unusual part of a digital image. The most unusual part for a given shape is defined as the part of the image that has the maximal distance to all non-intersecting shapes of the same form. The method was tested on two- and three-dimensional images and has shown very good results without any predefined model. The results can be used to scan large image databases, for example medical databases.

Keywords: Image processing, image statistics, image recognition

Title of the Paper: B-Spline Curve Generation and Modification based on Specified Radius of Curvature


Authors: Tetsuzo Kuragano, Kazuhiro Kasono

Abstract: A method to generate a quintic B-spline curve which passes through given points is described. In this case, there are four more equations than control point positions. Two methods have been developed to compensate for the difference between the number of unknowns and the number of equations: assuming that the curvatures at both ends of the curve are zero, or assigning four gradients to the given points. In addition to this method, another method is described that generates a quintic B-spline curve which passes close to given points and has specified first derivatives at those points. In this case, the linear system may be underdetermined, determined or overdetermined, depending on the number of given points with gradients. A method is also described for modifying a quintic B-spline curve shape according to a specified radius-of-curvature distribution, to realize an aesthetically pleasing freeform curve. The difference between the B-spline curve's radius of curvature and the specified radius of curvature is minimized using the least-squares method. Examples of curve generation are given.

Keywords: B-spline curve generation, curvature vector, curve shape modification, given points, given points with gradients, underdetermined system, overdetermined system
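
The underdetermined/determined/overdetermined cases mentioned above are all handled naturally by linear least squares. The small NumPy sketch below uses a polynomial basis as a stand-in for the quintic B-spline basis; the basis choice, sample counts and target curve are illustrative assumptions only.

```python
import numpy as np

# 8 fitting conditions but only 6 unknown coefficients: an
# overdetermined system, solved in the least-squares sense.
t = np.linspace(0, 1, 8)
y = np.sin(2 * np.pi * t)          # target shape to approximate
A = np.vander(t, 6)                # degree-5 polynomial basis matrix
coeffs, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
fit = A @ coeffs                   # least-squares curve values
```

`np.linalg.lstsq` returns the minimum-norm solution when the system is underdetermined and the least-squares solution when it is overdetermined, which is exactly the dichotomy the abstract describes for varying numbers of given points with gradients.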

Title of the Paper: An Analysis of Patterns for Automating Information System Operations


Authors: Matsuki Yoshino, Norihisa Komoda, Michiko Oba

Abstract: Automation of operations is essential for effective and efficient information system operation. As a first step towards establishing a design methodology for system management operations, operation patterns based on an analysis of existing operations conducted in data centers are proposed. For each pattern, the feasibility of automation and recommendations for improvement are analyzed. The correlation between operation patterns and the objectives of operations has also been analyzed.

Keywords: System management, workflow, patterns, automation, autonomous computing

Title of the Paper: Study about the Process Control of an Electric Arc Furnace using Simulations based on an Adaptive Algorithm


Authors: Manuela Panoiu, Caius Panoiu, Ioan Sora, Anca Iordan

Abstract: Electric arc furnaces are very large power loads and have negative effects on power quality (harmonic currents, unbalanced load, and reactive power). For maximum efficiency of power consumption, it is necessary to use an automatic system that controls the harmonic filters, the reactive power compensation installation, and the electrode positions, in order to obtain a high power factor and maximum efficiency. In this paper, an adaptive algorithm is used for the process control of an electric arc furnace. The method is validated using simulations in PSCAD/EMTDC, a software package dedicated to power systems.

Keywords: Adaptive control, LMS algorithm, active power control

Issue 12, Volume 5, December 2008

Title of the Paper: Interorganizational Partnership, Switching Cost, and Strategic Flexibility in Supply Chain


Authors: Jao-Hong Cheng, Chih-Huei Tang, Huei-Ping Chen

Abstract: Many studies focus on the interaction between the various dimensions of supply chain (SC) relationships (such as trust, commitment, satisfaction, investment, communication and collaboration), but far fewer on the impact of SC relationships on switching cost and strategic flexibility. The partnership interaction between the SC and strategic flexibility is a key factor when manufacturers and subcontractors are linked together. However, only a few empirical studies have examined the impact of switching cost and strategic flexibility, and the effect of SC relationships on strategic flexibility has received less attention. As a result, it becomes the key issue in this study. To address this, we developed a conceptual framework incorporating the dimensions of SC relationships, switching cost, and strategic flexibility. Data were drawn from the survey responses of 202 Taiwanese enterprises. Our findings provide considerable support for our conceptual model.

Keywords: Trust of partnerships, Strategic flexibility, Switching cost, Supply chain

Title of the Paper: A Modified Hopfield Neural Network for Perfect Calculation of Magnetic Resonance Spectroscopy


Authors: Hazem M. El-Bakry, Nikos Mastorakis

Abstract: In this paper, an algorithm is presented for the automatic determination of the nuclear magnetic resonance (NMR) spectra of the metabolites in the living body by magnetic resonance spectroscopy (MRS), without human intervention or complicated calculations. In this method, the problem of NMR spectrum determination is transformed into the determination of the parameters of a mathematical model of the NMR signal. To calculate these parameters efficiently, a new model called the modified Hopfield neural network is designed. The main achievement of this paper over the work in the literature [30] is that the speed of the modified Hopfield neural network is accelerated. This is done by applying cross-correlation in the frequency domain between the input values and the input weights. The modified Hopfield neural network can process complex signals perfectly without any additional computation steps, which is a valuable advantage since NMR signals are complex-valued. In addition, a technique called "modified sequential extension of section (MSES)", which takes into account the damping rate of the NMR signal, is developed to be faster than that presented in [30]. Simulation results show that the calculation precision of the spectrum improves when MSES is used along with the neural network. Furthermore, MSES is found to reduce the local minimum problem in Hopfield neural networks. Moreover, the performance of the proposed method is evaluated, and using the modified Hopfield neural network has no adverse effect on the performance of the calculations.

Keywords: Hopfield Neural Networks, Cross Correlation, Nuclear Magnetic Resonance, Magnetic Resonance Spectroscopy, Fast Fourier Transform
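
The speed-up described above, replacing direct products of inputs and weights with cross-correlation in the frequency domain, rests on the correlation theorem. A minimal, generic sketch of that identity (not the paper's network itself) looks like this:

```python
import numpy as np

def xcorr_fft(x, w):
    """Circular cross-correlation computed in the frequency domain:
    corr = IFFT( FFT(x) * conj(FFT(w)) )."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(w))))

# Correlating against a unit impulse at position 0 returns the signal
# itself, a quick sanity check of the frequency-domain identity.
x = np.array([1.0, 2.0, 3.0, 4.0])
w = np.array([1.0, 0.0, 0.0, 0.0])
c = xcorr_fft(x, w)
```

For a length-N signal this costs O(N log N) via the FFT instead of O(N^2) for direct correlation, which is the source of the acceleration the abstract claims.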

Title of the Paper: Prediction of the Impact Point for Spin and Fin Stabilized Projectiles


Authors: Dimitrios N. Gkritzapis, Dionissios P. Margaris, Elias E. Panagiotopoulos, George Kaimakamis, Konstantinos Siassiakos

Abstract: A method for real-time in-flight prediction of the ground impact point of indirect fire projectiles is investigated. The projectile is assumed to be rigid (non-flexible) and rotationally symmetric about its spin axis, launched at low and high pitch angles. A six-degree-of-freedom projectile model is used to propagate the projectile state from an arbitrary point along the trajectory to the ground impact point. The projectile's manoeuvring motion depends on the most significant force and moment variations, in addition to gravity. The developed computational method gives satisfactory agreement with published data from verified experiments and computational codes on atmospheric projectile trajectory analysis for various initial firing flight conditions.

Keywords: Aerodynamic forces and moments, low and high pitch angles

Title of the Paper: New Correlation Analysis Method for Nonstationary Signal


Authors: Zuojin Li, Weiren Shi, Kai Wang, Xiaowei Sun, Zhi Zhong

Abstract: This paper proposes a new correlation analysis method for nonstationary, energy-limited continuous signals, which works out a formula similar to time-correlation theory: when the operator meets a certain condition, the auto-correlation property of its Fourier transform can be represented by the Fourier transform of its energy function f^2(t), revealing the correlation of the Fourier transform coefficients over a frequency interval ω. It is proved that Parseval's identity is the special case occurring when ω = 0. This conclusion helps explain the theory of image compression and noise removal through the wavelet transform.

Keywords: Operator, energy function, wavelet transform, correlation function, image process

Title of the Paper: A Sensorless Closing Control for an AC Contactor Based on a New Armature Displacement Estimator


Authors: Chieh-Tsung Chi

Abstract: This paper presents a new approach to solving the problem of contact bounce after contact closing, in order to prolong lifespan and improve operating reliability. By combining an armature displacement estimator (ADE) with a simple hysteresis controller, so that the coil-current difference between the AC electromagnetic contactor (abbreviated AC contactor) body and the estimator is minimized, the proposed method overcomes the restriction, imposed by previously described closing control methodologies, that an armature displacement measuring mechanism must be installed. Using the proposed algorithm, an efficient control configuration can be obtained that reduces the bounce duration. Computer simulations of the proposed method illustrate the benefits provided by the proposed sensorless closing control configuration.

Keywords: Contact bounce, Estimator, Hysteresis controller, Armature displacement, Sensorless, AC electromagnetic contactor

Title of the Paper: Assessing the SMEs' Competitive Strategies on the Impact of Environmental Factors: A Quantitative SWOT Analysis Application


Authors: Hui-Lin Hai

Abstract: Strength, Weakness, Opportunity and Threat (SWOT) analysis is an established methodology for assisting the formulation of strategy. This paper proposes a new quantified SWOT analytic method incorporating the vote-ranking method. The SWOT indices are voted on, weighted and quantified to assess competitive strategies from top to bottom, while the total weighted-scores method is used to obtain the best strategy alternatives. The competitive strategies of Taiwanese Small and Medium Enterprises (SMEs) with respect to Environmental Management Systems (EMS) are taken as a case study, in which eighteen certificated ISO9000 or ISO14000 auditors (or lead auditors) were invited to form a decision group. Under the impact of environmental factors, the results show that the company's image and profitability constitute the most important strategy for SMEs in global markets. Taiwanese SMEs perceive the significance of EMS and recognize its importance for surviving in a diversified, competitive market environment, but they need to build up environmental management that meets the EMS specification. The findings are also applicable to other developing countries operating within global markets or facing similar barriers.

Keywords: Strength, Weakness, Opportunity and Threat (SWOT), Small and Medium Enterprises (SME), Environmental Management System (EMS)

Title of the Paper: Enhancement of Lifespan and Operating Reliability using a New Intelligent Control Method


Authors: Chieh-Tsung Chi

Abstract: This paper presents a bounce-duration reduction method for electromagnetic contactor systems. It combines an experience-based fuzzy algorithm with a hysteresis comparator of simple structure to form a new fuzzy-hysteresis methodology, which is valuable for minimizing the bounce duration after contact closing. The proposed method is implemented in a microcontroller program with a specially designed data structure for obtaining the dynamic spring anti-force and the upper and lower force-error limits of the hysteresis comparator. Tests on an experimental prototype show that the newly proposed method is valid and fast in practical applications.

Keywords: Electromagnetic contactor, Bounce duration, Fuzzy, Hysteresis, Microcontroller, Contacts, Controller

Title of the Paper: A Search Mechanism Based on Ontology Technology for Students in Elementary School


Authors: Dowming Yeh, Su-Ling Hsiang

Abstract: Searching for information on the Web has become part of everyone's daily activity in modern society. The Web brings much convenience into our lives; however, with the amount of information growing at a staggering speed, sieving out useful information and knowledge effectively has become an important skill. In order to obtain better search results, one needs to know how to form appropriate keywords and phrases for search engines. Elementary school students, however, do not yet have sufficient cognitive abilities for this, and they often form a search string in natural language that contains only a few keywords within a long sentence; as a result, the search results often do not satisfy their needs. This study addresses this problem by introducing an agent to parse the search string, align synonyms, sift out keywords, and include hyponyms in the intended field. Knowledge in a specific learning unit is represented with ontology technology and transformed into keywords and an associated structure. This study also integrates the Google API and establishes specific function characteristics to eliminate unrelated files when conducting the search. The results show that the prototype system can provide pupils with more accurate results.

Keywords: Search engine, empirical study, natural language, agent, ontology

Title of the Paper: Comparative Genome Sequence Analysis by Efficient Pattern Matching Technique


Authors: Muneer Ahmad, Hassan Mathkour

Abstract: The genetic sequences of different species contain precious biological information. This information is hidden in the order in which the nucleotide base characters appear (A-Adenine, T-Thymine, G-Guanine and C-Cytosine). This order naturally varies between organisms, but by comparing the biological information contained in sequences one can draw conclusions about similarities and differences in the nature, habits and living of species.
In this paper, we present an algorithm that provides an approximate comparative match between any input strands. It overcomes the drawbacks and shortcomings of prevailing techniques. Finding an approximate match is most difficult when Genome Adoptive Points (GAPS) are present in the input sequences; this algorithm tries to handle such complex situations and finds the number of approximate matches for optimal results.

Keywords: Limitation Check, Upper Bound, Lower Bound, Page Size, Position Specification, Counters
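
A standard way to score an approximate match between two strands in the presence of gaps is dynamic-programming edit distance, where insertions and deletions model gaps in the alignment. The sketch below is a generic baseline, not a reproduction of the paper's own algorithm.

```python
def edit_distance(s, t):
    """Classic dynamic-programming edit distance: insertions, deletions
    and substitutions each cost 1.  Runs in O(len(s) * len(t)) time and
    O(len(t)) space."""
    m, n = len(s), len(t)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(
                prev[j] + 1,                            # deletion (gap in t)
                cur[j - 1] + 1,                         # insertion (gap in s)
                prev[j - 1] + (s[i - 1] != t[j - 1]),   # substitution
            )
        prev = cur
    return prev[n]

def similarity(s, t):
    """Approximate-match score in [0, 1] between two strands."""
    return 1 - edit_distance(s, t) / max(len(s), len(t), 1)

d = edit_distance("ATGGCTA", "ATGCTA")  # one gap (deleted G)
```

For genome-scale inputs one would use banded or bit-parallel variants of the same recurrence, but the scoring idea is identical.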

Title of the Paper: Coding of Checksum Components for Increasing the Control Reliability of Data Transmission for Military Applications


Authors: Nikolaos Bardis

Abstract: Computer systems used in military and other challenging applications are often exposed to increased levels of electromagnetic radiation. Embedded systems in this category often suffer from such exposure due to the operation of the device to which they belong. Consequently, data communications within such devices need to be protected against transmission errors. Current general-purpose encoding schemes used in other communication applications are prohibitively complex for this application. In this paper, an innovative extension of the well-known checksum concept is proposed that is capable of controlling errors in intra-device data transfers. The new technique is shown to be simple enough to implement and to increase the probability of error detection by several orders of magnitude. The scheme is hence shown to be suitable for embedded computing platforms in military and other demanding systems. More specifically, the modified checksum is examined with respect to its suitability for transient error detection in schemes for reliable data storage within computing systems, and it is explained why it is extremely suitable for this application. The modified checksum is also considered in the context of algorithm-based fault tolerance schemes, and it is again concluded that it can contribute to the overall scheme's efficiency and effectiveness. The modified checksum is hence shown to be an algorithmic tool that can significantly contribute to the design of reliable and fault-tolerant computing systems, such as those used in military systems or other applications that operate in adverse environments.

Keywords: Checksum, differential Boolean transformations, error detection, data integrity, fault tolerance
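
For reference, the classical additive (ones'-complement) checksum that such modified schemes extend can be sketched as follows; the paper's differential Boolean modification itself is not reproduced here, and the example message is arbitrary.

```python
def checksum16(data: bytes) -> int:
    """Plain 16-bit ones'-complement additive checksum: the classical
    baseline that modified checksum schemes build upon."""
    total = 0
    # Sum the data as big-endian 16-bit words (zero-pad odd lengths).
    for i in range(0, len(data), 2):
        word = data[i] << 8 | (data[i + 1] if i + 1 < len(data) else 0)
        total += word
    # Fold the carries back into the low 16 bits.
    while total >> 16:
        total = (total & 0xFFFF) + (total >> 16)
    return (~total) & 0xFFFF

msg = b"embedded payload"
ok = checksum16(msg)
corrupted = bytes([msg[0] ^ 0x01]) + msg[1:]   # single-bit transmission error
bad = checksum16(corrupted)
```

A single-bit error changes the folded sum by a nonzero amount modulo 0xFFFF, so it is always detected; the weakness the baseline leaves (and that modified schemes target) is multi-bit errors whose contributions cancel.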

Title of the Paper: A Tool for Military Officers Enhancing Life-Long Learning Applied on the Paradigm of Risk Preparedness and Management


Authors: Nikolaos V. Karadimas, Nikolaos Doukas, Nikolaos P. Papastamatiou

Abstract: In this paper, a model for managing the job rotation of personnel attached to specific military units is proposed. This model aims to maintain the overall preparedness of the units against known risks by ensuring that skilled personnel are present in each unit at all times. The model is used for the development of a software tool that manages the systematic training of army officers in risk prevention and crisis management. The aim of this tool is to globally optimize the movements of army officers between units and training centers so that no unit is ever left without staff trained for a particular type of contingency. While officers are in training, they prepare for new hazards and draw up emergency plans for dealing with these particular hazards with the support of more experienced trainers or mentors. When the rotation ends, officers return to their units and assume responsibilities both in implementing the plans they have helped create and in disseminating the knowledge and skills they have acquired, training more of the unit's staff to conduct the appropriate parts of the newly introduced emergency procedures.

Keywords: Risk Preparedness, Job Rotation, Training Sessions, Military Units, Military Applications, Web Services, Grid Architecture

Title of the Paper: Information Spillover Effects of IPOs using 2SLS


Authors: Jao-Hong Cheng, Huei-Ping Chen

Abstract: Several Initial Public Offering (IPO) regulatory changes have been implemented in the Taiwan stock market since 2005. One of these new mechanisms releases the IPO pricing rules to underwriters. Underwriters and listed companies have grown rapidly in recent years, and the overall offer size has become full-scale. However, foreign scholars have found that pioneer companies face many uncertainties in IPOs because there is no prior information about offer price and proceeds to refer to. When companies want to decide on an appropriate offer price and proceeds, they have to spend a great deal of time and money on information gathering. If information spillover exists, this information can be acquired from other companies that have already completed their IPOs. Therefore, we investigate whether information spillover exists in the Taiwan IPO market, using 2SLS and Probit models for the analysis. The empirical results show positive information spillover in offer price and listing price revision. For underwriters, rank is based on underwriter reputation, and underwriters with a better rank usually do not obtain high initial returns.

Keywords: Initial Public Offerings (IPOs), information spillover, underwriting reputation, Two-Stage Least Squares (2SLS)

Title of the Paper: A Brand Position Model for the Resort Hotels at Kenting Area in Taiwan


Authors: Tsuen-Ho Hsu, Hsiu-Hui Cheng, Ching-Cheng Shen

Abstract: Given the rapid development of leisure tourism and the increasing demand for resort hotels, most enterprises face intense competition. They therefore seek to build a unique brand in the minds of consumers in order to enhance competitiveness and increase profits. In this study, we address the positioning of brand equity and the key factors of competition for resort hotels. The questionnaire method is applied to collect data, and the resort hotels in the Kenting area are taken as an illustrative example. The primary findings of this study can be summarized as follows:
1. Customers have a high degree of recognition of the environmental quality, local character and brand uniqueness of the resort hotels in the Kenting area.
2. According to the brand equity diagram derived by applying multidimensional scaling (MDS) to tourists in the Kenting area, the resort hotels can be grouped into four clusters.
These findings provide a useful reference on marketing strategy and competitive differentiation for the resort hotels in the Kenting area. In addition, we construct a brand equity recognition model using Fisher's discriminant function technique, which supports the necessary analysis when enterprises wish to understand their position in the competitive environment via brand equity recognition.

Keywords: Brand Equity, Resort Hotels, Multidimensional Scaling (MDS), Discriminant Function


Copyright © WSEAS