


 Volume 6, 2009
Print ISSN: 1790-0832
E-ISSN: 2224-3402

Issue 1, Volume 6, January 2009

Title of the Paper: High Efficient Knowledge Extracting Platform Based on E-Delivery Pattern


Authors: Meiqun Liu, Kun Gao

Abstract: Existing data mining delivery mechanisms are generally based on moving data to a mining server node, or moving data mining code to the data resource node. This pattern does not suit the requirements of some new situations. In this paper, we propose a novel knowledge extraction mechanism based on an E-delivery pattern. The new mechanism can efficiently handle cases in which a node holds data resources but lacks computing power, or has computing power but no data resources and only low transmission bandwidth. The kernel of the proposed system adopts OGSA as its foundation and provides open services for systematic expansion. We also discuss the feasibility and scalability of the approach.

Keywords: Software delivery, Knowledge Discovery, Knowledge Extracting, OGSA

Title of the Paper: High Efficient Scheduling Mechanism for Distributed Knowledge Discovery Platform


Authors: Meiqun Liu, Kun Gao

Abstract: Distributed data mining plays a crucial role in knowledge discovery in very large databases. Since the distributed knowledge discovery process is both data and computationally intensive, the Grid is a natural platform for deploying a high-performance data mining service. The key issue for a distributed data mining Grid system is how to schedule data mining tasks efficiently. In this paper, we propose a novel and efficient mechanism based on decomposing data mining tasks, mapping them to a DAG, and ordering them according to their respective execution costs. The results show that this mechanism is scalable and feasible.

Keywords: Grid Computing, Data Mining, Tasks Scheduling
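
The cost-ordered DAG scheduling idea summarized in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function name, the greedy "cheapest ready task first" rule, and the cost dictionary are all assumptions.

```python
from collections import defaultdict

def schedule(tasks, deps, cost):
    """Order DAG tasks: respect dependencies, prefer cheaper ready tasks.

    tasks: list of task ids
    deps:  list of (u, v) edges meaning u must run before v
    cost:  dict task -> estimated execution cost
    """
    succ = defaultdict(list)
    indeg = {t: 0 for t in tasks}
    for u, v in deps:
        succ[u].append(v)
        indeg[v] += 1
    ready = [t for t in tasks if indeg[t] == 0]
    order = []
    while ready:
        ready.sort(key=lambda t: cost[t])   # cheapest ready task first
        t = ready.pop(0)
        order.append(t)
        for v in succ[t]:                   # release tasks whose deps are done
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return order
```

A real Grid scheduler would also weigh node capabilities and data transfer costs; this sketch only shows the ordering step.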

Title of the Paper: Research on E-business Intelligent Examination System


Authors: Xin Jin

Abstract: Traditional E-business education and training commonly adopts two approaches: theoretical teaching in the classroom, and computer-based application practice in an E-business simulation environment that resembles (but is not identical to) an E-business web site. It is difficult, however, to assess students' ability to operate E-business applications. This paper presents the design and implementation of an intelligent examination system for assessing examinees' E-business application skills, and proposes a novel architecture for an on-line examination system based on E-business workflow. The system adopts the common client-server pattern with two major parts, a Student site and a Teacher site. The Student site provides the examinee with a GUI (Graphical User Interface) embedding a WWW browser for answering the paper. The Teacher site mainly comprises a paper management subsystem for managing the paper database, a paper building subsystem for constructing new papers, and a scoring subsystem for grading the examinees' papers. We also analyze and discuss the system architecture, the simulation environment, some key problems, and the corresponding solutions.

Keywords: Intelligence, intelligent examination system, E-business application operation, workflow, E-business simulation environment

Title of the Paper: E-exhibition Towards International Trade under the Current Global Economic Crisis


Authors: Zheng Lei-Na, Pan Tie-Jun, Fang Lei-Lei, Yu Yun, Hu Yue, Bao Hui-Han

Abstract: The current financial and economic crisis, which broke out all over the world, confronts international trade with unprecedented challenges, especially for China's Small and Medium-Sized Enterprises (SMEs). The MICE (Meetings, Incentives, Conventions and Exhibitions) industry has been regarded as a new engine of economic recovery, yet traditional exhibitions are declining as enterprises lack funds. This has allowed E-exhibition to flourish, thanks to its comparative advantages: low cost, high efficiency, unlimited display space, unrestricted scale of operation, a wide range of audiences, increased trading opportunities, and timely feedback, statistics, and electronic evaluation enabled by Web 2.0 and Web 3.0 technologies. We analyze the SME targets and requirements of E-exhibition, and present a knowledge management method using semantic web services that includes product, process, technology, service, and application-domain sub-ontologies. Furthermore, we propose a logical view of E-exhibition based on the Semantic Web, select consumer goods as the application domain, and build an E-exhibition platform with J2EE. Finally, the consumer goods ontology architecture is given and a case study is presented.

Keywords: Knowledge management, E-exhibition, international trade, Small and Medium-Sized Enterprise, semantic, ontology, web services

Title of the Paper: Quantitative Analysis of Regional Economic Integration Process Affected by Economy and Trade Cooperation Between China and Japan


Authors: Yu Sun, Guoxin Wu, Changchun Gao

Abstract: With the stepwise advance of comprehensive national strength, maintaining a high level of economic growth has become one of the most crucial goals for a nation. Beyond enacting various policies, establishing economic integration has been considered an effective way to keep the national economy growing. Why, then, is East Asian economic integration still far away? This paper analyzes the status quo of trade and economic cooperation between China and Japan, as well as East Asian economic integration. Based on quantitative econometric analysis, we apply linear and nonlinear regression and find a strong correlation between China and Japan. Weighting theory, linear modeling, and game theory are also used to support the argument. The reasons that slow the pace of integration are discussed, and the sticking points are identified after the conclusive results are given.

Keywords: Integration of East Asian Economy, Economy and Trade Cooperation between China and Japan, Quantitative analysis, Regression analysis

Title of the Paper: The Enlightenment to the Chinese Insurance Business by analysis on the U.S. Government’s Takeover of American International Group


Authors: Junlu Wang

Abstract: Since the American sub-prime crisis erupted, it has caused fluctuations in the US real economy and in capital markets across the whole world, and its influence cannot be underestimated. The American International Group (AIG), a world leader in insurance and financial services, has also paid a huge price. For China, the immediate influence of the crisis is limited, but the insurance business is an important component of the financial services industry. This article therefore argues that it is necessary to study what lessons the AIG takeover holds for China's insurance business.

Keywords: Sub-prime crisis, American International Group, Chinese insurance business, CDS, CDOs, Financial supervision

Title of the Paper: Research on Personalized Recommendation Based on Web Usage Mining Using Collaborative Filtering Technique


Authors: Taowei Wang, Yibo Ren

Abstract: Collaborative filtering is the most successful technology for building personalized recommendation systems and is extensively used in many fields. This paper presents a system architecture for personalized recommendation using collaborative filtering based on web usage mining, and describes the data preparation process in detail. To improve recommendation quality, a new personalized recommendation model is proposed that takes URL-related analysis into account and combines it with the K-means algorithm. Experimental results show that the proposed model is effective and can enhance recommendation performance.

Keywords: Collaborative filtering, Personalized recommendation, Web usage mining, Data preparation, Cluster algorithm, Similarity
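
The neighborhood-based collaborative filtering step described above can be sketched in a few lines. This is a generic illustration, not the paper's model: the K-means clustering and URL-related analysis are omitted, and all function names and profile structures are hypothetical.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse usage profiles (dict url -> weight)."""
    common = set(u) & set(v)
    num = sum(u[k] * v[k] for k in common)
    du = math.sqrt(sum(x * x for x in u.values()))
    dv = math.sqrt(sum(x * x for x in v.values()))
    return num / (du * dv) if du and dv else 0.0

def recommend(target, profiles, k=2, n=3):
    """Score unseen URLs by the similarity-weighted votes of the k nearest profiles."""
    sims = sorted(((cosine(target, p), p) for p in profiles),
                  key=lambda s: s[0], reverse=True)[:k]
    scores = {}
    for s, p in sims:
        if s <= 0:                      # ignore dissimilar neighbors
            continue
        for url, w in p.items():
            if url not in target:       # only recommend pages not yet visited
                scores[url] = scores.get(url, 0.0) + s * w
    return sorted(scores, key=scores.get, reverse=True)[:n]
```

In a web usage mining setting, each profile would come from a preprocessed server-log session rather than explicit ratings.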

Title of the Paper: Study on Application of Advanced J2EE in Stocks Exchange


Authors: Xiaoyan Yang, Weifeng Yin, Jifang Li, Deliao Yu

Abstract: With the progress of Java technology, the advantages of J2EE 5.0 are gradually becoming apparent. As stock investment grows increasingly popular, this paper designs and implements a stock investment supervision system using the recent EJB 3.0 together with Hibernate and Struts, based on the new J2EE 5.0. The paper explores the implementation of EJB data persistence. The system has been tested several times and runs effectively. The paper also discusses the new basic functionality of JDK 5.0 and the J2EE 5 SDK features.

Keywords: Multi-tier, platform-independent, J2EE 5, EJB, stocks, Hibernate, Struts, JDK5.0

Title of the Paper: Bluetooth Channel Quality Simulation, Estimation and Adaptive Packet Selection Strategy


Authors: Guo Feng, Xiao Qimin, Xiao Qili

Abstract: A full-duplex Bluetooth simulation model is presented in Simulink/MATLAB. The model includes the Bluetooth physical layer and baseband layer. A short-range wireless communication channel model is established that takes into account positioning, propagation effects, and radio characteristics. On this basis, the radio performance of the Bluetooth system is investigated by simulation, including data transmission throughput and the Frame Error Rate and Bit Error Rate of the SCO link. The simulation reveals the detailed transmission performance of different packet types. Furthermore, the model can simulate additional Bluetooth communication scenarios, such as coexistence with other ISM devices or interference from other Bluetooth piconets, so with only minor modification it can support a wide range of research work. Testing system and device designs early in the development process in this way can substantially increase productivity and reduce the risk of design flaws. The transmission performance of the Bluetooth 2.0+EDR specification is analyzed, including the average maximum throughput under different channel qualities. A new adaptive packet selection strategy based on the history of packet errors is proposed: the channel quality is estimated from the packet error statistics of recently transmitted packets, and the type of the next packet to send is chosen and adjusted dynamically. The simulation results show that when about 30 packets are used for the decision, the throughput closely approaches the average maximum throughput. The strategy imposes little hardware and software overhead, and the decision procedure is simple and quick, so it can be used in any Bluetooth system with a control unit.

Keywords: Bluetooth, Packet, Channel Quality, Throughput, Packet Error Rate
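
The adaptive selection strategy lends itself to a compact sketch. The window of about 30 packets follows the figure quoted in the abstract; everything else here (the error-rate threshold, the class name, and the DM1/DH5 packet-type choice) is an illustrative assumption, since the paper's exact decision rule is not reproduced on this page.

```python
from collections import deque

class PacketSelector:
    """Pick the next Bluetooth ACL packet type from recent error history."""

    def __init__(self, window=30, threshold=0.1):
        self.history = deque(maxlen=window)  # 1 = packet error, 0 = success
        self.threshold = threshold           # assumed PER cutoff, not from the paper

    def record(self, error):
        self.history.append(1 if error else 0)

    def next_type(self):
        if not self.history:
            return "DH5"                     # optimistic default on a fresh link
        per = sum(self.history) / len(self.history)
        # Poor channel: short FEC-protected packets; good channel: long unprotected ones.
        return "DM1" if per > self.threshold else "DH5"
```

A production implementation would choose among the full set of 2.0+EDR packet types and hysteresis thresholds derived from the throughput curves, but the structure is the same.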

Title of the Paper: The Degradation, Prevention and Treatment of Black Soil in Jilin Province


Authors: Bian Hong-Feng, Yang Guang, Sheng Lian-Xi, Jiang Jing

Abstract: Northeast China's black soil is mainly distributed in the upper reaches of the Songliao Basin in Heilongjiang Province and Jilin Province. The black soil area in Jilin Province is approximately 45,200 km2, which accounts for 24.7 percent of the province's total area. As a result of excessive human interference, black soil resources in Jilin have degraded noticeably. Serious erosion of black soil, reduction of soil nutrients, deterioration of physical and chemical properties, and escalating soil pollution have posed a serious threat to national food production, ecological and environmental security, and future socio-economic sustainable development. This paper mainly discusses the natural and human factors behind the degradation of the black soil. The natural factors include global climate change, terrain characteristics, and vegetation cover; the human factors mainly concern rapid population growth, unreasonable farming practices, soil pollution caused by industrial and agricultural production, and urbanization that changes the practical function of the soil. Through study of the mechanism and essence of soil degradation, prevention and treatment of black soil degradation should begin with its basic characteristics and the process of its occurrence. At the same time, soil resources and the environment should be managed in a unified way, and different types of degradation should be distinguished. According to the basic principles of ecology, and from the perspective of ecological balance, biological, engineering, and agricultural measures should be combined in order to unify development and protection.

Keywords: Jilin Province, black soil degradation, prevention and treatment

Title of the Paper: Research on the Interaction Among Interest Rate, Exchange Rate Fluctuations and Capital Market


Authors: Mingliang Li, Xinhui Zhou

Abstract: Interest rate and exchange rate fluctuations have very significant impacts on the domestic economy and on capital market development. Drawing on traditional theory such as interest rate parity, capital flow theory, and the classical IS-LM model, the paper carries out an in-depth theoretical and empirical analysis of the interaction and complex relationships among RMB interest rates, RMB appreciation, and securities prices. The research shows that there are strong correlations among China's foreign exchange market, money market, and stock market; with RMB appreciation and interest rate increases, the capital accumulation effect and the stock asset revaluation effect will push up the overall price level of China's capital market. At the same time, given the limited flexibility of current RMB interest rates and exchange rates, and the opposing impacts of currency appreciation and interest rates, we expect the effect of RMB appreciation on the capital market to be weaker than in the historical case of yen appreciation. However, because expectations of RMB appreciation may attract flows of international hot money and thereby inflate a Chinese economic bubble, this trend cannot be neglected; high vigilance should be maintained, and the relationship between RMB appreciation and interest rate rises should be handled cautiously.

Keywords: Currency revaluation, interest rates, capital flows

Title of the Paper: Analysis on Investment Behavior Deviation


Authors: Xinhui Zhou, Mingliang Li

Abstract: The paper draws on investment behavior theory to study the factors affecting financial investors' decision-making and the behavioral deviations of investment funds. It also analyzes the efficiency of asset allocation and the investment behavior of China's securities investment funds, and studies their impact on China's securities market. On this basis, it uses the LSV model to test the funds' herding behavior, their investment behavior deviation, and its effect on the orientation of China's market. Finally, it offers suggestions and points out paths and strategies for correcting the investment behavior deviation of China's investment funds.

Keywords: Investment Funds; Asset Allocation; Investment Behavior Deviation; Amending Paths

Title of the Paper: Analyzing Privacy and Security Issues in the Information Age - An Ethical Perspective


Authors: Ji-Xuan Feng, Janet Hughes

Abstract: With the expanding use of ICT across the globe, there is growing concern about privacy and security issues. Today, people find that their personal information is hard to protect. Results from a literature review, a survey, and a case study all indicate a clear solution: the key is people. It is people who develop ICT, and people who decide the different ways that ICT is used. Therefore, education about international law and ethics, and education to develop an understanding of cultural differences, will foster a positive attitude and help improve the privacy and security situation in the information age. Finally, there is a discussion of how to establish a viable security culture.

Keywords: Security, Privacy, Law, Ethics, Core value, Global culture education

Title of the Paper: The Analysis of Uncertain Knowledge Based on Meaning of Information


Authors: Ping Cang, Sufen Wang

Abstract: The paper discusses four types of knowledge concepts, especially three kinds of uncertain knowledge, based on the relationships among data, information, and meaning. The goal is to fuse research results from knowledge engineering and epistemology. We analyze four basic forms of connection (i.e., meaning) between information and data using formal contexts, and derive their physical significance. Following Dretske's semantic information theory, we analyze the information entropy of the four types of meaning. Using the resulting entropy, we then examine how data convey information content and how the types of meaning relate to different concepts of knowledge. Finally, recognizing that uncertain knowledge is involved throughout, we further express the four types of meaning (i.e., the four types of connections between information and data) using Rough Set theory, which deepens our understanding of uncertain knowledge.

Keywords: Data, information, meaning, uncertain knowledge

Issue 2, Volume 6, February 2009

Title of the Paper: Modified On-Demand Multicasting Routing Protocol for Ad Hoc Networks


Authors: Amjad Hudaib, Khalid Kaabneh, Mohamad Qatawneh

Abstract: This paper proposes and simulates a new ad hoc multicast routing protocol called the Modified On-Demand Multicast Routing Protocol (MODMRP), based on the On-Demand Multicast Routing Protocol (ODMRP). MODMRP introduces two approaches aimed at reducing service traffic: local detection of routes, and use of channel-condition information when setting the route information update timer values. The goal of MODMRP is to improve packet delivery efficiency. Through a series of simulation experiments, MODMRP is compared with ODMRP on a number of factors, such as packet delivery ratio and transmission delays. The simulation results show an average increase of 2% in the percentage of delivered multicast datagrams and an average decrease of 20.5% in routing overhead costs compared with ODMRP.

Keywords: Ad hoc Network, MODMRP, ODMRP, Routing Protocol, On-Demand Multicasting

Title of the Paper: Multi-Bandwidth Data Path Design for 5G Wireless Mobile Internets


Authors: Abdullah Gani, Xichun Li, Lina Yang, Omar Zakaria, Nor Badrul Anuar

Abstract: The 5th generation is envisaged as a complete network for the wireless mobile internet, capable of offering services that accommodate potential application requirements without sacrificing quality. The ultimate goal of 5G is to build a truly wireless world, free from the obstacles of earlier generations; this requires an integration of networks. In this paper, we propose the design of a Multi-Bandwidth Data Path that integrates current and future networks into a new network architecture for the 5G wireless world. We also present our proposed architecture and simulation results.

Keywords: 5G, Wireless Networks, Multi-Bandwidth Data Paths, Mobile Internet

Title of the Paper: Pedagogical Resources Management for E-Learning


Authors: Daniel Hunyadi, Iulian Pah

Abstract: E-learning changes the way courses are designed. Delivered through the web, course content cannot be a direct transcription of face-to-face course content. A course can be seen as an organization in which different actors are involved; these actors produce documents, information, and knowledge that they often share. In this paper we present an ontology-based, document-driven memory that is particularly well adapted to e-learning, where a shared memory is all the more useful because interactions do not usually occur in the same place or at the same time. First we clarify our conception of e-learning and analyze the actors' needs. Then we present the main features of our learning organizational memory and focus on the ontologies on which it is based. We consider two kinds of ontologies: the first is generic and concerns the domain of training; the second relates to the application domain and is specific to a particular training program. We present our approach for building these ontologies and show how they can be merged.

Keywords: E-learning, Ontology, Organizational memory, Topic maps

Title of the Paper: Usability and Performance of Secure Mobile Messaging: M-PKI


Authors: Nor Badrul Anuar, Lai Ngan Kuen, Omar Zakaria, Abdullah Gani, Ainuddin Wahid Abdul Wahab

Abstract: Human lifestyles changed substantially when cellular technology went commercial, and the Short Message Service (SMS) and Multimedia Message Service (MMS) now play important roles in daily life. A recent report by the Mobile Data Association (MDA) [1] shows that SMS and MMS use grew 30 percent from 2007 to 2008. Conventional SMS/MMS provides no protection for the text message sent, which creates security threats to privacy and message integrity. Mobile users seek a solution that lets them exchange confidential information safely. This motivates M-PKI, an application that secures the mobile messaging service using a public key infrastructure (PKI). This new approach allows the end-user to send private and classified messages via SMS. In addition, M-PKI offers message classification, a feature specially designed to meet varying user requirements for security and performance. The usability and performance of M-PKI messaging in the encryption and decryption process are tested on selected Java-enabled phones.

Keywords: Cryptography, message classification, performance, PKI, RSA, SMS/MMS

Title of the Paper: Employing Artificial Immunology and Approximate Reasoning Models for Enhanced Network Intrusion Detection


Authors: Seyed A. Shahrestani

Abstract: With the massive connectivity provided by modern computer networks, more and more systems are subject to attack by intruders. The creativity of attackers, the complexities of host computers, along with the increasing prevalence of distributed systems and insecure networks such as the Internet have contributed to the difficulty in effectively identifying and counteracting security breaches. As such, while it is critical to have the mechanisms that are capable of preventing security violations, complete prevention of security breaches does not appear to be practical. Intrusion detection can be regarded as an alternative, or as a compromise to this situation. Several techniques for detecting intrusions are already well developed. But given their shortcomings, other approaches are being proposed and studied by many researchers. This paper discusses the shortcomings of some of the more traditional approaches used in intrusion detection systems. It argues that some of the techniques that are based on the traditional views of computer security are not likely to fully succeed. An alternative view that may provide better security systems is based on adopting the design principles from the natural immune systems, which in essence solve similar types of problems in living organisms. Furthermore, in any of these methodologies, the need for exploiting the tolerance for imprecision and uncertainty to achieve robustness and low solution costs is evident. This work reports on the study of the implications and advantages of using artificial immunology concepts for handling intrusion detection through approximate reasoning and approximate matching.

Keywords: Intrusion detection, Natural immune system, Soft computing, Approximate reasoning

Title of the Paper: A Superior Choice Making System for Optimal Journey Agency


Authors: Huay Chang

Abstract: This paper presents a decision-making system to help travelers find the best journey agency. First, the factors that travelers consider when evaluating a journey agency were collected through a literature review. These factors are classified into three units: the price unit, the tour product characteristics unit, and the journey agency image unit, each with its own sub-factors. Subsequently, a minimum norm approach is used to determine the optimal fit between a traveler and the selected journey agency. The findings reveal the key decision elements that influence consumers' purchasing and thus lead to an understanding of consumers' characteristics. Furthermore, the simulation results suggest that adopting appropriate promotion methods to retain or expand the consumer base is a further key issue for journey agencies.

Keywords: Decision choice making system method, journey agency, consumers, simulation, minimum norm

Title of the Paper: Development of An Online Image Repository System for Cardiac Modeling


Authors: Fariza Hanum Nasaruddin, Maryam Zakeryfar, Rodina Ahmad

Abstract: The development of digital cardiac models has become a research area in its own right. This research aims to assist in the understanding of the heart. The data needed to build a digital cardiac model may come in various formats, including medical image data such as MRI and CT scans, and other scans such as PET and SPECT. The availability of cardiac images is very important in cardiac modeling research, but these images are difficult to obtain for researchers outside the medical field. Storing such available data in an on-line system therefore makes it accessible to other researchers. This paper discusses and elaborates on the design and development of such an on-line cardiac image repository system. The system requirements were captured using the Attribute Driven Design (ADD) method, a method for designing software architecture that satisfies both quality and functional requirements. The system developed so far handles only image data, not other formats of cardiac data such as ASCII flat files and video files. It supports both downloading and uploading of data. With the data readily available, researchers can spend more time on the modeling process rather than on preprocessing and searching for data.

Keywords: Online repository system, image data, cardiac data, cardiac modeling, attribute driven design, digital heart model

Title of the Paper: Assisting Novice Researchers in Utilizing the Web as a Platform for Research: Semantic Approach


Authors: Maizatul Akmar Ismail, Mashkuri Yaacob, Sameem Abdul Kareem, Fariza Hanum Nasaruddin

Abstract: Access to the internet has tremendously changed the way information is disseminated. The emergence of digital libraries and institutional repositories provides endless supplies of knowledge. Scholars, in particular, use research output in the form of conference proceedings, journals, and theses as references and guidelines for generating new knowledge for future generations. Novice researchers are the scholars most likely to drown in this ocean of information, and the scholarly content resides in several heterogeneous databases that need to be integrated. Support in the early stage of study is crucial for novice researchers, as it gives them insight into where to seek further information on institutions, people, and research trends without having to go through the tedious process of identifying this information by themselves. Previous studies have identified support features useful for novice researchers, among them Relevant Literature and Expert Detection. The purpose of this paper is to explore each of these support features and to suggest a state-of-the-art development approach utilizing Semantic Web technologies. The algorithms involved in each support feature are discussed. The implementation results show that significant information can be obtained to help novice researchers accelerate the research process. The evaluation of the information retrieved through the support features is elaborated, and future work on enhancing the proposed prototype is discussed in the concluding remarks.

Keywords: Novice Researchers, Support Features, Semantic Approach

Title of the Paper: Gender Differences in Learning Flickr: A Picture is Worth a Thousand Words


Authors: Eric Zhi Feng Liu, Yu Fang Chang

Abstract: Networking environments such as BBS, discussion forums, websites, wiki, Flickr, and blogs have become increasingly popular. Currently, the term Web 2.0 is widely used. It indicates that users have become the focus of Internet usage; moreover, unlike one-way information delivery, users can now upload and download data freely and conveniently. Flickr is one of the Web 2.0 tools. By using Flickr, users can easily upload photos and share them with others. Furthermore, users can interact with each other by annotating pictures. For instance, students share creative photos and express their thoughts to peers or others through Flickr. We therefore set out to design a Flickr course for teaching students how to use this tool. The students' prior knowledge was analyzed as the first step of the ADDIE model, and on the basis of this data analysis we designed the Flickr course. The results suggest that most students were satisfied with the course and were confident about using Flickr, with no gender differences observed.

Keywords: Flickr, ADDIE model, Instructional system design model, Web 2.0, Gender differences

Title of the Paper: A Particle Swarm with Selective Particle Regeneration for Multimodal Functions


Authors: Chi-Yang Tsai, I-Wei Kao

Abstract: This paper proposes an improved particle swarm optimization (PSO). To increase efficiency, suggestions on parameter settings are made, and a mechanism is designed to prevent particles from falling into local optima. To evaluate its effectiveness and efficiency, the approach is applied to multimodal function optimization tasks. Sixteen benchmark functions were tested, and the results were compared with those of PSO, HNMPSO, and GA-PSO. The results show that the proposed method is both robust and well suited to multimodal function optimization.

Keywords: Particle Swarm Optimization, Cognitive and Social Parameter, Selective Particle Regeneration, Mutation Operation, Multimodal functions
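
The baseline algorithm being improved here can be sketched compactly. This is a generic PSO with a crude regeneration step, not the paper's method: the parameter values (w=0.7, c1=c2=1.5) and the rule of periodically re-randomizing the worst 10% of particles are illustrative assumptions standing in for the paper's selective particle regeneration mechanism.

```python
import random

def pso(f, dim, n=20, iters=200, bounds=(-5.0, 5.0), regen_every=25):
    """Minimize f over a box; periodically regenerate the worst particles."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]              # each particle's best position
    pval = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]       # swarm's best position so far
    w, c1, c2 = 0.7, 1.5, 1.5                # inertia, cognitive, social weights
    for t in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = f(pos[i])
            if v < pval[i]:
                pval[i], pbest[i] = v, pos[i][:]
                if v < gval:
                    gval, gbest = v, pos[i][:]
        if t % regen_every == regen_every - 1:
            # Re-randomize the worst ~10% of particles to escape local optima.
            worst = sorted(range(n), key=lambda i: pval[i])[-max(1, n // 10):]
            for i in worst:
                pos[i] = [random.uniform(lo, hi) for _ in range(dim)]
                vel[i] = [0.0] * dim
    return gbest, gval
```

On a unimodal test function such as the sphere, this converges quickly; the paper's contribution lies in which particles to regenerate and when, which matters on multimodal landscapes.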

Title of the Paper: Data Collection System Design in SSM Networks with Unicast Feedback – Server Message Definition


Authors: Martin Koutny, Pavel Silhavy, Jiri Hosek

Abstract: This paper describes a system for mass data collection with cumulative acknowledgement, and specifies a server message for cumulative acknowledgement. The system is developed for single-source multicast (SSM) networks, which are characterized by the ability to communicate with specific network nodes and to create groups of nodes; this makes them more efficient than any-source multicast networks. The first part of the paper designs and describes the system. The next part discusses the possible processing mechanisms and introduces initialization, synchronization, and the message transfers. In the last part, we discuss future work.

Keywords: SSM and ASM networks, unicast feedback, data collection, RTP, RTCP

Title of the Paper: Topical Web Crawling Using Weighted Anchor Text and Web Page Change Detection Techniques


Authors: Divakar Yadav, A. K. Sharma, J. P. Gupta

Abstract: In this paper, we discuss the focused web crawler, the relevance of anchor text, and a method of web page change detection for search engines. We propose a technique called weighted anchor text, which uses the link structure to form a weighted directed graph of anchor texts. A weight is assigned to every incoming link of a node in the directed graph, and these weights are then used to decide the relevance of web pages: pages are indexed in decreasing order of the weights assigned to them. We applied our algorithm to various websites, observed the results, and conclude that the algorithm can be very useful when incorporated with other existing algorithms. Web usage has increased exponentially in the past few years; this enormous collection of web pages changes rapidly, and the degree of change varies from site to site. We discuss the relevance of change detection, explore related work in the area, and on that basis propose a new algorithm to map changes in a web page. After verifying the results on various web pages, we discuss the relative merits of the proposed algorithm.
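The ranking step the abstract describes, indexing pages in decreasing order of accumulated incoming-link weight, can be sketched as below. This is an illustrative reading of the abstract, not the paper's exact weighting scheme; the edge weights and page names are hypothetical.

```python
def rank_by_anchor_weight(edges):
    """edges: (source, target, weight) triples of a weighted directed
    graph of anchor texts. Returns targets ordered by total incoming
    weight, i.e. the order in which pages would be indexed."""
    score = {}
    for src, dst, w in edges:
        score[dst] = score.get(dst, 0.0) + w
    return sorted(score, key=score.get, reverse=True)

# Hypothetical anchors from pages a, b, c pointing at three target pages:
order = rank_by_anchor_weight([
    ('a', 'p1', 0.9), ('b', 'p1', 0.4),
    ('a', 'p2', 0.7), ('c', 'p3', 0.2),
])
```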

Keywords: Anchor Text, Directed Graph, Topical Focused Crawling, Web Crawler, DOM, Checksum, Change detection

Title of the Paper: The Effect of Emotions and Cognitions on Continuance Intention in Information Systems


Authors: S. C. Wang, Y. S. Lii, K. T. Fang

Abstract: While much of the prior research on information system (IS) adoption and usage continuance has examined cognitive factors, the role of emotion in understanding and predicting user behavior remains relatively unexplored. This study proposes a hybrid model that integrates emotion, cognition, satisfaction, and post-adoption behavior in order to obtain a better understanding of users’ continuance intention toward IS. Survey data were obtained from 318 Blackboard Learning System (BLS) users. The paper assessed the psychometric properties of the measures through confirmatory factor analysis and then employed structural equation modeling to examine how well the proposed model predicts users’ continued adoption of IS. The results show that positive and negative emotions mediate the effect of confirmation and directly predict user satisfaction. However, perceived usefulness and perceived ease of use predict the level of user satisfaction better than emotions do, and perceived usefulness is a stronger predictor of user satisfaction than the other variables.

Keywords: Confirmation, Perceived usefulness, Perceived ease of use, Positive emotion, Negative emotion, Satisfaction, Continuous adoption intention

Title of the Paper: Decision Fusion for Improved Automatic License Plate Recognition


Authors: Cristian Molder, Mircea Boscoianu, Iulian C. Vizitiu, Mihai I. Stanciu

Abstract: Automatic license plate recognition (ALPR) is a pattern recognition application of great importance for access control, traffic surveillance and law enforcement; many studies therefore concentrate on creating new algorithms or improving their performance. Many authors have presented algorithms based on individual methods such as skeleton features, neural networks or template matching for recognizing license plate symbols. In this paper we present a novel approach to the decisional fusion of several recognition methods, as well as new classification features. The classification results prove to be significantly better than those obtained by each method considered individually. For better results, syntax corrections are also applied. Several trainable and non-trainable decisional fusion rules have been taken into account, bringing out the best of each classification method. The experimental results are very encouraging, with a symbol good recognition rate (GRC) of more than 99.4% on a real license plate database.
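The simplest non-trainable fusion rule of the kind the abstract mentions is majority voting over the individual recognizers' symbol decisions. The sketch below assumes hypothetical per-classifier labels; the paper's actual fusion rules are not reproduced here.

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse symbol decisions from several classifiers by simple majority.
    `predictions` is a list of labels, one per classifier; ties are broken
    in favour of the label encountered first."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical recognisers disagree on one plate symbol:
fused = majority_vote(['8', 'B', '8'])  # two of three say '8'
```

Trainable rules would instead weight each classifier's vote, e.g. by its validation accuracy, before combining.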

Keywords: Licence Plate, ALPR, Pattern Recognition, Skeleton Features, Neural Networks, Decision Fusion

Title of the Paper: Predicting the Continuance Usage of Information Systems: A Comparison of Three Alternative Models


Authors: S. C. Wang, Y. S. Lii, K. T. Fang

Abstract: This study seeks to examine, through empirical evidence, the relative explanatory power of three prospective models in predicting users’ continued adoption of information systems (IS). The three models are: the Expectation-Confirmation Theory Model (ECTM, Model 1), the integration of ECTM with the Technology Acceptance Model (TAM) (ECT-TAM model, Model 2), and a hybrid model integrating ECT, TAM and emotions (Model 3).
Survey data were obtained from 350 web portal users. The paper assessed the psychometric properties of the measures through confirmatory factor analysis and then employed structural equation modeling to examine and compare the ability of the three prospective models to predict users’ continued adoption of IS.
Data analysis using LISREL shows that all three models meet the various goodness-of-fit criteria. In terms of variance explained for intention to continue IS usage, all three models perform equally well. As for the explanatory power of satisfaction, Model 3 has the highest R2 (71%), followed by Model 2 (69%) and Model 1 (68%). This result confirms earlier discussions of continuance intention behavior, in which adding emotion factors to a cognitive process model enhances the predictive power for satisfaction. Perceived usefulness and perceived ease of use predict the level of user satisfaction better than emotions do, and perceived usefulness is a stronger predictor of user satisfaction than the other variables. Model 3 provides additional information that increases our understanding of IS continuance intention behavior.

Keywords: Confirmation, Perceived usefulness, Perceived ease of use, Positive emotion, Negative emotion, Satisfaction, Continuous adoption intention

Title of the Paper: A Study on the Applications of Data Mining Techniques to Enhance Customer Lifetime Value


Authors: Chia-Cheng Shen, Huan-Ming Chuang

Abstract: In today’s competitive environment, a successful company must provide better customized services, ones that are not only acceptable to customers but also satisfy their needs, in order to survive and gain a competitive advantage. Many studies have shown that it is more costly to acquire new customers than to retain old ones. Consequently, evaluating current customers in order to enhance their lifetime value becomes a critical factor in the success or failure of a business. This study applies data from the customer and transaction databases of a department store, based on the RFM model, and performs clustering analysis to recognize high-value customer groups for cross-selling promotions. The findings show that clustering analysis can locate high-value customers, and the company can then apply appropriate target marketing to enhance their lifetime value effectively. The implication for marketers is that leveraging data mining techniques can make the most of customer and transaction data and thus create sustainable competitive advantages.
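The clustering step the abstract describes can be sketched with a plain k-means over (recency, frequency, monetary) vectors. The customer data, scaling, and choice of k-means are illustrative assumptions; the paper's actual clustering algorithm and data are not reproduced.

```python
import math
import random

def kmeans(points, k, iters=20, seed=1):
    """Plain k-means on tuples of equal length (here, scaled RFM vectors)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[nearest].append(p)
        # Recompute each centre as the mean of its group (keep old if empty).
        centers = [
            tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

# Toy customers as (recency, frequency, monetary), already scaled to [0, 1];
# low recency plus high frequency/monetary suggests high lifetime value.
pts = [(0.1, 0.9, 0.95), (0.15, 0.85, 0.9), (0.9, 0.1, 0.05), (0.8, 0.2, 0.1)]
centers, groups = kmeans(pts, k=2)
# The high-value group is the cluster with the highest mean monetary score.
high_value = max((g for g in groups if g),
                 key=lambda g: sum(p[2] for p in g) / len(g))
```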

Keywords: RFM, Customer Lifetime Value, Analytical Hierarchy Process, Target Marketing, Data Mining

Title of the Paper: Business Failure Prediction Model based on Grey Prediction and Rough Set Theory


Authors: Jao-Hong Cheng, Huei-Ping Chen, Kai-Lun Cheng

Abstract: Many methods have been used in the past to predict business failure, such as discriminant analysis, logit analysis, and quadratic functions. Although some of these methods lead to models with a satisfactory ability to discriminate between healthy and bankrupt firms, they suffer from limitations, often due to unrealistic statistical assumptions. We have therefore developed a hybrid system aimed at weakening these limitations: a model that predicts firm failure from past financial performance data by combining grey prediction with a rough set approach, making it possible to predict from few data and with fast calculation. The results are very encouraging compared with the original rough set method, and demonstrate the usefulness and effectiveness of the proposed method for firm failure prediction.
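The abstract does not name the specific grey model; GM(1,1) is the standard choice in grey prediction, so the sketch below assumes it. The financial-ratio series is a toy example, not data from the paper.

```python
import math

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey prediction: fit on series x0, forecast `steps` ahead."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]              # accumulated series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # mean sequence
    y = x0[1:]
    m = n - 1
    # Least-squares solution of x0(k) + a*z(k) = b via 2x2 normal equations.
    S_z, S_zz = sum(z), sum(v * v for v in z)
    S_y, S_zy = sum(y), sum(v * w for v, w in zip(z, y))
    det = S_zz * m - S_z * S_z
    a = (S_z * S_y - m * S_zy) / det     # development coefficient
    b = (S_zz * S_y - S_z * S_zy) / det  # grey input

    def x1_hat(k):  # fitted accumulated value at 0-based index k
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    # De-accumulate to get forecasts of the original series.
    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]

# Toy series growing roughly 10% per period; GM(1,1) fits such growth closely,
# so the next value should come out near 146.4.
pred = gm11_forecast([100, 110, 121, 133.1], steps=1)
```

GM(1,1) needs as few as four observations, which matches the abstract's claim of predicting from few data.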

Keywords: Grey Prediction, Rough Set Theory (RST), Business Failure, Financial Ratio

Issue 3, Volume 6, March 2009

Title of the Paper: Strategic and Collaborative Approach in Information Literacy Applied in Engineering Teaching


Authors: Angela Repanovici

Abstract: This paper provides an overview of an information skills program integrated into the first-year engineering subject "Documentation Techniques". This is a problem-based learning subject, which requires the students to work through and report on an engineering project. Over the past four years the program has been transformed radically by applying an action research framework primarily concerned with continual improvement and change in practice. Currently the information skills program consists of a student-led orientation tour and an integrated subject web page (developed using the SEARCH AND WRITE tutorial).

Keywords: Engineering, information literacy, information resources, search strategies, search engines, invisible web

Title of the Paper: Optimization of Production Plan through Simulation Techniques


Authors: G. Caputo, M. Gallo, G. Guizzi

Abstract: Reducing production costs in a highly competitive environment is a critical success factor. In particular, in fields where outsourcing is used frequently and material costs have a high impact, a significant cost reduction can be achieved by making the value chain more efficient. Moreover, outsourcing is a way to adjust production capacity to demand. The coordination and planning of the various links of the chain, and the estimation of the productive capability of the various production units, therefore become fundamental for a coherent evaluation of promised delivery times. More and more frequently, and in greater detail, the companies at the "end of the network" require, through capacity assessment, a verification of the available productive capacity, from both a quality and a quantity point of view, that each single firm can dedicate to a certain type of production. This paper presents a simulation model that allows the elaboration of an operative production plan through finite-capacity scheduling of resources. The model minimizes stocking and set-up costs, treating other production costs as constant. Simulation is a technique that allows resource usage to be checked with better precision as constraints vary. This approach leads to a high-performing instrument for advanced planning and scheduling through analysis of the various possible scenarios. The auxiliary use of the optimization tool available in the ARENA software yields an optimal solution, confirming the validity of new applications of simulation for the verification of productive capacity in the short term.

Keywords: Supply chain management, advanced planning and scheduling, finite capacity scheduling, capacity management, simulation

Title of the Paper: The Development of METAKU to Support Learning in Hypermedia Environment


Authors: Saemah Rahman, Siti Fatimah Mohd Yassin, Normahdiah Sheikh Said

Abstract: The development of learning skills is not given attention in many classrooms at any level of our education system today; we simply assume that students will develop their own learning skills. The challenges students face in hypermedia learning environments need to be considered to help them learn effectively in this type of environment. METAKU was developed to help students apply metacognitive learning strategies, namely planning, monitoring and evaluation, during the learning process in a hypermedia learning environment. This paper discusses the development of METAKU, which employed the first three stages of the generic instructional design model ADDIE, whose stages are: 1) Analysis, 2) Design, 3) Development, 4) Implementation and 5) Evaluation. In the analysis stage, data were collected using a triangulation of methods carried out concurrently: a survey of students’ preferences for studying online versus offline; a focus group interview to identify the challenges they face and the strategies they use while accessing and studying hypertext materials; and a record of students’ interaction with the computer using screen-capture software. A total of 240 second-year university students at two public universities in Malaysia were involved in this study. The analysis stage provides information for stages 2 and 3, where the collected data were used in formulating the content and design of METAKU. It is hoped that METAKU will help students develop learning skills in hypermedia learning environments.

Keywords: Hypermedia learning environment, METAKU learning strategies, Metacognitive learning strategies, Online study skills

Title of the Paper: A Global Model for Virtual Educational System


Authors: Daniel Hunyadi, Iulian Pah, Dan Chiribuca

Abstract: This paper presents a virtual educational environment model that makes learning easier by using collaboration (and, by extension, the team-research model) as a form of social interplay. The model represents a universe where human agents interact with artificial agents (software agents). Given its vision, the system can be classified among advanced systems, for it is client-oriented (student-oriented) and provides value-added educational services thanks to its collaborative learning attribute. The model proposes an original architecture where elements of the socio-cultural theory of collaborative learning are assigned to the artificial intelligence components (the multi-agent system). The expected results are: conceptual models (agents, learning and teaching strategies, student and group profiles, communication between agents, negotiation strategies and coalition formation), software entities, and a methodology to evaluate the performance of eLearning systems.

Keywords: Socio-cultural models, multi-agent system, multi-agent architectures, collaborative learning, artificial agents, computer-based learning, distance learning

Title of the Paper: A Consumer Support Architecture for Enhancing Customer Relationships


Authors: Jyhjong Lin

Abstract: For enterprises, customer relationships are commonly recognized as a critical success factor. Effective customer relationships help enterprises deliver services and products to customers based on their needs, preferences, or past transactions. This model, however, emphasizes the use of customer information for the benefit of enterprises; customers, in contrast, receive less information from enterprises. To address this issue, technologies such as recommendation systems and intelligent agents have been proposed to provide customers with richer information for their possible needs. These technologies nevertheless still have shortcomings, which have initiated the recent discussion of a new paradigm, namely Consumer Support Systems (CSS). A CSS is specifically structured to support effective information provision from enterprises to consumers, where sophisticated information management mechanisms are employed to aid consumer decision making. In this paper, we present an architecture for the construction of such a CSS that provides advanced management of customer relationships by emphasizing information provision from enterprises to consumers. The architecture starts from the identification of CSS characteristics, proceeds through the recognition of the architectural components that support their realization, and ends with the specification of the collaborations among those components. The architecture is modeled with UML notations and illustrated by a CSS for book publishing.

Keywords: Customer relationship management, consumer support system, architecture, UML

Title of the Paper: Controlling and Disclosing your Personal Information


Authors: Norjihan Abdul Ghani, Zailani Mohamed Sidek

Abstract: As organizations come to rely on the collection and use of personal information to complete transactions and provide good services, more and more personal information is being shared with web service providers, creating the need to protect privacy. Personal information is processed, stored and disclosed, and often generated, in the course of making a commercial exchange. Credit card numbers, identity numbers, purchase records, monthly income, and related types of personal information all play an important role in commercial information systems. However, this creation and use of personal information raises privacy issues not only for individuals but also for organizations. Easy access to private personal information can lead to misuse of data and loss of control over the information, among other problems. It is therefore important to protect the information not only from external threats but also from insider threats, and to ensure controlled data disclosure when performing tasks in web-based applications. In the electronic scenario, personal information has been collected, stored, manipulated and disclosed without the owner’s consent. This paper discusses the relationship between personal information and its privacy. We also extend the model introduced by Al-Fedaghi as a way to control personal information disclosure, and suggest the use of Hippocratic Database concepts for the same purpose.

Keywords: Personal information, privacy, personal information flow model, Hippocratic Database

Title of the Paper: Personal Information Privacy Protection in E-Commerce


Authors: Norjihan Abdul Ghani, Zailani Mohamed Sidek

Abstract: Today, the world is moving towards e-commerce applications for completing daily tasks; e-commerce has become the preferred medium for doing so, and is a growing business in today’s market. Online shopping eliminates the conventional purchase approach, which is labor-intensive and time-consuming: through cyberspace, orders can be placed electronically and products produced and shipped without intermediaries. However, the potential for wide-ranging surveillance of all cyber activities presents a serious threat to information privacy, with damaging consequences for personal information privacy. In any e-commerce activity, all personal information, including its disclosure, should be controlled in order to protect its privacy. This paper discusses how personal information is used in e-commerce applications and how it should be controlled.

Keywords: Personal information, information privacy, electronic commerce

Title of the Paper: Employee Turnover: A Novel Prediction Solution with Effective Feature Selection


Authors: Hsin-Yun Chang

Abstract: This study proposes a new method that can select feature subsets more efficiently. In addition, the reasons why employees voluntarily leave were investigated, in order to increase classification accuracy and to help managers prevent employee turnover. The mixed feature subset selection used in this study combines the Taguchi method with nearest neighbor classification rules to select feature subsets and analyze the factors that best predict employee turnover. The samples used in this study were drawn from industry A: employees who left their jobs between 1 February 2001 and 31 December 2007, compared with incumbents. The results show that, through the mixed feature subset selection method, 18 factors important to employees were identified. In addition, the classification accuracy was 87.85%, higher than the 80.93% obtained before applying the feature subset selection method. The new feature subset selection method proposed in this study not only helps industries understand the reasons for employee turnover, but can also serve as a long-term classification predictor.
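The wrapper idea behind such subset selection, scoring a candidate feature subset by the accuracy of a nearest-neighbor classifier, can be sketched as below. The toy data and the leave-one-out scoring are illustrative assumptions; the paper's Taguchi-based search over subsets is not reproduced.

```python
import math

def loo_accuracy(X, y, features):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier using
    only the given feature indices. This is the score a subset-selection
    search would try to maximise."""
    def dist(a, b):
        return math.sqrt(sum((a[f] - b[f]) ** 2 for f in features))
    hits = 0
    for i in range(len(X)):
        # Nearest other sample; predict its label for sample i.
        j = min((k for k in range(len(X)) if k != i),
                key=lambda k: dist(X[i], X[k]))
        hits += y[j] == y[i]
    return hits / len(X)

# Toy data: feature 0 separates stayers (0) from leavers (1); feature 1 is noise.
X = [(0.1, 0.9), (0.2, 0.1), (0.9, 0.8), (0.8, 0.2)]
y = [0, 0, 1, 1]
acc_good = loo_accuracy(X, y, [0])   # informative feature only
acc_noisy = loo_accuracy(X, y, [1])  # noise feature only
```

A subset search would evaluate many such candidate index sets and keep the one with the highest score.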

Keywords: Voluntary Turnover, Feature Subset Selection, Taguchi Methods, Nearest Neighbor Classification Rules, Training Pattern

Title of the Paper: Data Protection for Land Consolidation with Distortion Tolerable LSB Watermarking


Authors: Li Li, Chao Zhang, Mingxin Liang, Daoliang Li

Abstract: With the rapid growth of Internet technology, data security protection has become an important issue. As a tool for improving the effectiveness of land cultivation, land consolidation plays an important role in enlarging farming plots and enhancing crop productivity, and more and more remote sensing images are used to carry it out. This paper presents an LSB watermarking algorithm to protect the security of the image data. In the proposed scheme, the land reorganization planning map is embedded into a remote sensing image of the same area for information security and copyright protection. Before being embedded, the land reorganization planning map is compressively encoded, which resolves the contradiction between the large quantity of information and the invisibility requirement of digital watermarking. To avoid the vulnerability of the original LSB method to attack, the host image is preprocessed with an exclusive-OR (XOR) operation: the value of the least significant bit is decided by the XOR of the first several bits, and the watermark signal is then embedded into the LSB of the processed host image by a further XOR operation. Moreover, by applying an optimal adjustment process, the watermarked image gains distortion tolerance. The experimental results illustrate that the embedded watermark is not only imperceptible to human eyes but also more secure in the statistical sense, and that the proposed scheme is superior in terms of distortion tolerance.
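The per-pixel XOR-preprocessed LSB embedding described above can be sketched as follows. The choice of the top three bits for "the first several bits" is an assumption, and the compression encoding and optimal adjustment steps are omitted; this is only a sketch of the XOR idea, not the paper's full scheme.

```python
def embed_bit(pixel, wbit, nbits=3):
    """Embed one watermark bit into an 8-bit pixel. The host LSB is first
    decided by the XOR of the top `nbits` bits (the preprocessing step),
    then XORed with the watermark bit."""
    top_xor = 0
    for i in range(8 - nbits, 8):
        top_xor ^= (pixel >> i) & 1
    return (pixel & ~1) | (top_xor ^ wbit)

def extract_bit(pixel, nbits=3):
    """Recover the watermark bit: XOR the stored LSB with the same
    top-bit XOR used at embedding time."""
    top_xor = 0
    for i in range(8 - nbits, 8):
        top_xor ^= (pixel >> i) & 1
    return (pixel & 1) ^ top_xor

host = [200, 13, 97, 54]                 # toy 8-bit pixel values
watermark = [1, 0, 1, 1]
marked = [embed_bit(p, b) for p, b in zip(host, watermark)]
recovered = [extract_bit(p) for p in marked]
```

Because only the LSB is touched, each pixel changes by at most 1, which is what keeps the mark imperceptible.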

Keywords: Land consolidation, watermarking, least significant bit (LSB), exclusive-OR (XOR) operation, remote sensing image, data protection, land reorganization planning map

Title of the Paper: A Hierarchical Object Oriented Method for Land Cover Classification of SPOT 5 Imagery


Authors: Wei Su, Chao Zhang, Xiang Zhu, Daoliang Li

Abstract: Land cover classification with high accuracy is necessary, especially in waste dump areas, where accurate land cover information is very important for eco-environment research, vegetation condition studies, and soil recovery. Funded by the international cooperation project "Novel Indicator Technologies for Minesite Rehabilitation and Sustainable Development", a hierarchical object-oriented land cover classification is produced in this study. The ample spectral, textural, structural and shape information of high-resolution SPOT 5 imagery is used synthetically in this method. Object-oriented information extraction involves two steps: image segmentation and classification. First, the image is segmented using chessboard segmentation and the multi-resolution segmentation method. Second, NDVI is used to distinguish vegetation from non-vegetation; vegetation is classified as high-, medium- or low-density using spectral information and object-oriented image texture analysis, while non-vegetation is classified as vacant land or main road using the length/width ratio. The accuracy assessment indicates that this hierarchical method can be used for land cover classification in waste dump areas: the total accuracy increases to 86.53%, and the Kappa coefficient increases to 0.7907.
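The first split of the hierarchy, separating vegetation from non-vegetation by NDVI, can be sketched per pixel or object as below. The 0.2 threshold and the reflectance values are illustrative, not the paper's.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index from near-infrared and red
    reflectance; vegetation reflects strongly in NIR, so NDVI is high."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def classify(nir, red, veg_threshold=0.2):
    """Top level of the hierarchy: vegetation vs non-vegetation.
    The threshold value is an illustrative assumption."""
    return 'vegetation' if ndvi(nir, red) > veg_threshold else 'non-vegetation'

# Dense canopy vs bare ground, as (NIR, red) reflectance pairs:
labels = [classify(0.6, 0.1), classify(0.3, 0.28)]
```

The lower levels of the hierarchy would then subdivide each branch using texture and shape measures on the segmented objects.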

Keywords: Hierarchical land cover classification, NDVI, object oriented texture analysis, waste dump opencast coalmine area, SPOT 5

Title of the Paper: Monitoring of Landscape Change for Waste Land Rehabilitation in Haizhou Opencast Coal Mine


Authors: Yingyi Chen, Daoliang Li

Abstract: Land rehabilitation is being carried out throughout the whole country, but in many areas its main purpose is to increase the overall cultivated land area, neglecting ecological construction. Important tasks of modern landscape ecology are to monitor and assess natural resources, to examine the impacts and effects of human intervention and, not least, to observe the state of the environment over long periods of time. The objective of this research was to create a method for land rehabilitation projects grounded in landscape ecology by combining Geographical Information Systems (GIS) and Landscape Ecology Analysis (LEA). GIS techniques were developed for the digital preparation and analysis of historical maps and the subsequent digital land use mapping. The landscape spatial pattern and the influences on the landscape were expressed by the dominance index, contagion index, cohesion index, etc. Applying these landscape ecology indexes, the influence on the landscape spatial pattern caused by the land development and rehabilitation planning of the Haizhou coal mine waste area was studied by comparing landscape characteristics before and after the planning implementation. The analysis of the structural landscape changes proved to be an important aspect.

Keywords: Monitoring, Landscape Ecology, Land rehabilitation, opencast coal mine

Title of the Paper: Soil Environmental Quality Assessment in Sustainable Rehabilitation of Mine Waste Area: Establishing an Integrated Indicator-based System


Authors: Xiang Zhu, Yingyi Chen, Daoliang Li

Abstract: Soil environmental quality is the capacity of a soil to function, within ecosystem and land use boundaries, to sustain biological productivity, maintain environmental quality, and promote plant, animal and human health. In the long term, vegetative rehabilitation of mining wastes aims, as far as possible, at the proper ecological integration of the reclaimed area into the surrounding landscape, in a way that is sustainable and requires minimal maintenance. This article presents an indicator-based system of soil environmental quality that evaluates the sustainable rehabilitation of mine waste through two subindicators, chemical fertility and stocks of organic matter, and further combines them into a single General Indicator of Soil Quality (GISQ). The design and calculation of the indicators were based on sequences of multivariate analyses; principal component analysis (PCA) was used to assess soil quality overall, and the GISQ combined the subindicators into a global assessment of soil environmental quality. Our findings provide evidence that the selected indicators can provide a definitive, quantitative assessment of soil environmental quality, and lend credence to the value of our approach in quantifying relationships between soil function and indicators for specific areas.

Keywords: Soil environmental quality, Sustainable rehabilitation, Mine waste area, Indicator-based

Title of the Paper: Feature Extraction Method for Land Consolidation from High Resolution Imagery


Authors: Rui Guo, Daoliang Li

Abstract: Land consolidation is a tool for increasing the area of arable land and improving the effectiveness of land cultivation. With the development of high-resolution imagery, the progress of a land consolidation project can be monitored objectively by acquiring information from the images. This paper presents a method to extract the wells and roads of a land consolidation project from high-resolution images. Well extraction is based on a gray-level template matching algorithm; road extraction is based on mathematical morphology, a method for detecting image components that are useful for representation and description. The vector planning maps and the high-resolution images used to monitor the completion of the project are first registered, and candidate areas are created using the buffer and extract-by-mask functions in GIS. A well template is selected manually from the image and used to find the wells that match it. In the road extraction step, the top-hat transform and gray-level dilation are used to filter image noise; in this way the road features become wider and more recognizable. Image binarization and a thinning algorithm are then used to extract the one-pixel centerline of each road, and finally the thinning results are converted into the final vector detection results.
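The gray-level template matching step for wells can be sketched as an exhaustive sliding-window comparison; the sum-of-squared-differences score used here and the toy image are illustrative, as the paper's exact similarity measure is not stated in the abstract.

```python
def match_template(image, template):
    """Slide `template` over `image` (both lists of rows of grey levels)
    and return the top-left (row, col) with the smallest sum of squared
    differences, i.e. the best match."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# A bright 2x2 'well' at row 1, col 2 of a toy grey-level image:
img = [
    [10, 12, 11, 10],
    [11, 10, 90, 88],
    [10, 12, 91, 89],
    [12, 11, 10, 12],
]
pos = match_template(img, [[90, 88], [91, 89]])
```

In practice the search would be restricted to the GIS-derived candidate buffer areas rather than the whole image.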

Keywords: Feature extraction, Mathematical morphology, Template matching, Land consolidation, Remote sensing

Title of the Paper: Comparison of Pixel-Based and Object-Oriented Knowledge-Based Classification Methods using SPOT5 Imagery


Authors: Minjie Chen, Wei Su, Li Li, Chao Zhang, Anzhi Yue, Haixia Li

Abstract: Land cover mapping is very important for evaluating natural resources and understanding societal and business activities, and remote sensing techniques provide effective and efficient methods to create such maps. For high spatial resolution imagery such as SPOT5 imagery, land cover classification precision can be improved with knowledge rules, Digital Elevation Model (DEM) data, and spatial information such as texture. Both a pixel-based classification method based on knowledge rules and an object-oriented fuzzy classification method are studied in this paper using SPOT5 high spatial resolution imagery, with GIS datasets and texture integrated into both knowledge-based classifications. The accuracy assessment indicates that both classifications achieve good precision, but the object-oriented method does better. Moreover, shape and context information can be fully exploited to distinguish roads from buildings with the object-oriented fuzzy classification method, which is hard to accomplish in pixel-based classification. Furthermore, the object-oriented classification method is more suitable for land cover mapping because it produces meaningful objects.

Keywords: SPOT5 imagery, Texture, DEM, Pixel-based, Object-oriented, Knowledge-based rules

Title of the Paper: Modelling and Method for Beef Quality Risk Identification and Optimization in Beef Cattle Breeding


Authors: Hui Li, Jian Zhang, Lingxian Zhang, Liang Shi, Daoliang Li, Zetian Fu

Abstract: The meat industry is seeking to establish reassurance on traceability and production techniques that may help to promote confidence in the integrity and origin of its products. The overall tracing of beef quality is in effect the identification and control of risk along the supply chains of beef production. This study focuses on methods for identifying and controlling quality risks in traditional Chinese beef cattle breeding, with optimization approaches considered. The quality risks in beef cattle breeding were classified by feature in order to develop a dissemination model of these risks and its algorithms. The theory of quality traceability was then used to propose a conceptual model of quality risk control; tracing units were divided and tracing nodes were set through optimization. The proposed risk identification and control models are capable of identifying and handling a product, and the information attached to it, throughout the whole production process up to retail packs.

Keywords: Beef Breeding, Risk Identification, Risk Control, Traceability

Title of the Paper: Melancholia Diagnosis Based on GDS Evaluation and Meridian Energy Measurement using CMAC Neural Network Approach


Authors: Chin-Pao Hung, Hong-Jhe Su, Shih-Liang Yang

Abstract: In this paper, a preliminary melancholia diagnosis scheme is proposed, based on analyses of a depression questionnaire and the meridian energy of the human body using a CMAC (Cerebellar Model Articulation Controller) neural network approach. First, from a large amount of hospital data recording aged patients' depression rating scales and twelve sets of meridian energy signals, the patterns of three disease groups are sieved out, assuming that the recorded data capture the necessary features of melancholia patients. A CMAC neural network is then built to learn the melancholia features from the three disease groups' patterns. With sufficient training, the diagnosis architecture memorizes the features of the selected melancholia patient patterns. Finally, the resulting diagnosis system can be used to estimate the depression scale by feeding the twelve sets of meridian energy signals of the human body into the CMAC neural network. To support pattern collection, retraining, diagnosis, and data analysis, a PC-based user-friendly operation interface is also developed in this paper, offering functions such as new pattern addition, retraining, and plots of the memory weight distribution.

Keywords: Energy medicine, disease diagnosis, melancholia, depression questionnaire, CMAC, GDS, neural network

Title of the Paper: Value of Project Management – A Case Study


Authors: Pasi Ojala

Abstract: The amount of software in many products has increased. Software projects have become more complex, and their management requires a significant range of skills from every project manager. Limited resources, strict budgets, cost control, and the need for accurate reporting, documentation, and good quality are part of every project manager's life. As business challenges project managers more and more, it would be useful to know which areas of project management create the greatest value for projects. Value Engineering has been a usable method for developing high-value products for several years, and it has been applied successfully to software processes as well as to software product development. This research analyses the value of project management using a Value Engineering based value assessment. This is done in part by defining the concepts of value, worth, and cost, and in part by outlining the Value Engineering process alongside project management practices. The practical industrial case shows that there is considerable variation in value between typical project management tasks. It also shows that the value of project management tasks can be improved using Value Engineering based value assessment.

Keywords: Project management, software engineering, value engineering, worth, cost, value

Title of the Paper: Histogram Remapping as a Preprocessing Step for Robust Face Recognition


Authors: Vitomir Struc, Janez Zibert, Nikola Pavesic

Abstract: Image preprocessing techniques represent an essential part of face recognition systems and have a great impact on the performance and robustness of the recognition procedure. Among the many techniques already presented in the literature, histogram equalization has emerged as the dominant preprocessing technique and is regularly used for the task of face recognition. With the property of increasing the global contrast of the facial image while simultaneously compensating for the illumination conditions present at the image acquisition stage, it represents a useful preprocessing step, which can ensure enhanced and more robust recognition performance. Even though more elaborate normalization techniques, such as the multiscale retinex technique and isotropic and anisotropic smoothing, have been introduced to the field of face recognition, they have been found to be more of a complement than a real substitute for histogram equalization. However, on closer examination of the characteristics of histogram equalization, one quickly discovers that it represents only a specific case of the more general concept of histogram remapping techniques (which may have similar characteristics to histogram equalization). While histogram equalization remaps the histogram of a given facial image to a uniform distribution, the target distribution could easily be replaced with an arbitrary one. As there is no theoretical justification for why the uniform distribution should be preferred over other target distributions, the question arises: how do other (non-uniform) target distributions influence the face recognition process, and are they better suited for the recognition task? To tackle these issues, we present in this paper an empirical assessment of the concept of histogram remapping with the following target distributions: the uniform, the normal, the lognormal, and the exponential distribution. We perform comparative experiments on the publicly available XM2VTS and YaleB databases and conclude that recognition results similar to or even better than those ensured by histogram equalization can be achieved when other (non-uniform) target distributions are considered for the histogram remapping. This enhanced performance, however, comes at a price, as the non-uniform distributions rely on parameters that have to be trained or selected appropriately to achieve optimal performance.
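The remapping concept described above can be sketched in a few lines: rank the pixel intensities to obtain their empirical CDF values, then push those values through the quantile (inverse CDF) function of the chosen target distribution. The sketch below is our own minimal illustration of the idea, not the authors' implementation; the function names and the exponential scale parameter are illustrative choices.

```python
import numpy as np

def remap_histogram(image, target_quantile):
    """Remap pixel intensities so their empirical distribution follows the
    target distribution given by its quantile (inverse CDF) function.
    Histogram equalization is the special case of a uniform target."""
    flat = image.ravel().astype(np.float64)
    ranks = flat.argsort().argsort()       # rank of each pixel intensity
    u = (ranks + 0.5) / flat.size          # empirical CDF values in (0, 1)
    return target_quantile(u).reshape(image.shape)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8))

# Uniform target on [0, 255]: plain histogram equalization.
eq = remap_histogram(img, lambda u: 255.0 * u)

# Exponential target, one of the non-uniform distributions assessed.
exp_mapped = remap_histogram(img, lambda u: -40.0 * np.log(1.0 - u))
```

Swapping in a normal or lognormal quantile function covers the other target distributions considered in the paper.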

Keywords: Face recognition, preprocessing techniques, histogram equalization, histogram remapping

Title of the Paper: Development of a Data Warehouse for Lymphoma Cancer Diagnosis and Treatment Decision Support


Authors: Teh Ying Wah, Ong Suan Sim

Abstract: Data warehousing is becoming an indispensable component of the data mining process and business intelligence. Data warehouses often act as data collector, data integrator, and data provider in the data mining process. This paper reviews the development and use of a clinical data warehouse specific to Lymphoma (Lymph Node) cancer, which could be used by doctors, physicians, and other health professionals, in conjunction with a Clinical Decision Support System (DSS), to support the clinical process and to formulate an appropriate model to improve the quality of diagnosis and treatment recommendation decision making. This paper proposes a 5-stage sequential methodology for clinical data warehouse development. Research on the evaluation of the developed data warehouse and how it supports the data mining process will be discussed in a separate paper.

Keywords: Clinical data warehouse, Clinical Decision Support System (DSS), Lymphoma or Lymph Node cancer, diagnosis and treatment recommendation decision making

Title of the Paper: Information Model of Intelligence and Memorizing in Early Childhood


Authors: Vinko Viducic, Damir Boras, Ljiljana Viducic

Abstract: The object of analysis of this paper is a global model of intelligence and the memorizing of information from conception until a child starts to walk, for the period from 2009 until 2020. In order to define its main characteristics and determinants, a growth model was used. Numerous appropriate scientific methods were employed, the most important among them being: analysis and synthesis, induction and deduction, description, comparison, statistical and mathematical methods, modelling (the growth matrix), and proof and disproof. The key factors for strengthening intelligence, and therefore the memorizing of information, were chosen as model variables. Quantitative analysis is then used to determine the importance of individual variables as well as their interdependence.

Keywords: Intelligence, learning, information memorizing, information model

Issue 4, Volume 6, April 2009

Title of the Paper: Analyses of Task Based Learning in Developing "M-Learn" Mobile Learning Software Solution: Case Study


Authors: Majlinda Fetaji, Bekim Fetaji

Abstract: Developing mobile software solutions that enhance learning in university environments by using new mobile communication technology is becoming a very popular research focus. However, there is a lack of research on the instruction method that should be used in the development of mobile learning software, even though it is a very important factor in both the learning process and the software engineering process. In this paper we investigate, develop, and analyse a mobile software solution that uses the Task Based Learning model. The research focuses on the developed software solution and reviews the learning modelling approach, concentrating on task based learning, which the findings indicate is the best approach for mobile learning. For testing purposes, a software prototype called M-Learn was developed; its analysis has been investigated and recommendations for developing similar software solutions have been defined.

Keywords: Task based learning, mobile software development, Learning instructions, m-learning

Title of the Paper: Shape Matching by Curve Modelling and Alignment


Authors: Cecilia Di Ruberto, Marco Gaviano, Andrea Morgera

Abstract: Automatic information retrieval in the field of shape recognition has been widely covered by many research fields. Various techniques have been developed using different approaches, such as intensity-based, model-based, and shape-based methods. Whichever way the objects in images are represented, a recognition method should be robust in the presence of scale change, translation, and rotation. In this paper we present a new recognition method, based on a curve alignment technique, for planar image contours. The method consists of several phases, including extracting outlines of images, detecting significant points, and aligning curves. The dominant points can be detected manually or automatically. The matching phase uses the idea of calculating overlap indices between shapes as similarity measures. To evaluate the effectiveness of the algorithm, two databases of 216 and 99 images have been used. A performance analysis and comparison is provided by precision-recall curves.
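As a hedged illustration of an overlap index between two shapes: the abstract does not give the exact formula, so this sketch uses the common intersection-over-union measure on aligned binary masks.

```python
import numpy as np

def overlap_index(a, b):
    """Overlap between two aligned binary shape masks: area of the
    intersection divided by area of the union (1 = identical, 0 = disjoint)."""
    a, b = a.astype(bool), b.astype(bool)
    return (a & b).sum() / (a | b).sum()

# Two 6x6 squares shifted by one pixel inside a 10x10 grid.
s1 = np.zeros((10, 10), bool); s1[2:8, 2:8] = True
s2 = np.zeros((10, 10), bool); s2[3:9, 3:9] = True
iou = overlap_index(s1, s2)    # 25 shared pixels out of 47 in the union
```

In a pipeline like the one described, such an index would be computed after the curve alignment step, so that the score reflects shape similarity rather than pose differences.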

Keywords: Curve alignment, Information retrieval, Object recognition, Image indexing, Precision-recall

Title of the Paper: A Framework and Implementation of Information Content Reasoning in a Database


Authors: Xiaochuan Wu, Junkang Feng

Abstract: Databases’ capability is limited in terms of inference. In particular, when users seek information beyond the scope of the data within a database, the database normally cannot provide it. The underlying reason for this problem is that queries are answered based on a direct match between a query and the data (up to aggregations of the data). We observe that it is possible to find information in a database beyond that. To this end, we propose a framework for information content reasoning in a database. A number of basic concepts are defined first. Then we present the framework and explain how it works. Moreover, we describe how such a framework is implemented by means of a prototype, including a test with sample queries.

Keywords: Information content, Reasoning, Knowledge discovery from databases, Semantic theory of information, Databases

Title of the Paper: A Novel Approach for Missing Data Processing based on Compounded PSO Clustering


Authors: Hung-Pin Chiu, Tsen-Jenwei, Hsiang-Yi Lee

Abstract: Incomplete and noisy data significantly distort data mining results; taking care of missing values or noisy data is therefore extremely crucial in data mining. Recent research has started to exploit data clustering techniques to estimate missing values, and the quality of the clustering analysis obviously has a significant influence on the performance of missing data estimation. The clustering problem has been proven NP-hard. Particle swarm optimization (PSO) is a recently suggested heuristic search process for solving data clustering problems. In this paper, a compounded PSO (CPSO) clustering approach is proposed for missing value estimation. Normalization methods are first utilized to filter outliers and prevent some attributes from dominating the clustering result. Then the K-means algorithm and a reflex mechanism are combined with standard PSO clustering so that it quickly converges to a reasonably good solution. Meanwhile, an iteration-based filling-in value scheme is utilized to guide the search of CPSO clustering for the optimal estimated values. The effectiveness of the proposed approach is demonstrated on several data sets at four different rates of missing data. The empirical evaluation shows the superiority of CPSO over the well-known K-means, PSO, and SOM-based approaches, and it is desirable for solving missing value problems.
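The iteration-based filling-in idea can be sketched as follows, with a plain K-means pass standing in for the paper's compounded PSO clustering; the function, its naive initialization, and all parameter values are our own illustrative choices.

```python
import numpy as np

def impute_by_clustering(X, k=2, iters=5):
    """Iteratively estimate missing entries (NaN): start from column means,
    cluster the completed data, replace each missing entry with the value
    from its cluster's centroid, and repeat."""
    X = np.asarray(X, dtype=np.float64)
    mask = np.isnan(X)
    filled = np.where(mask, np.nanmean(X, axis=0), X)   # initial fill-in
    for _ in range(iters):
        centers = filled[:k].copy()          # naive init, fine for a sketch
        for _ in range(10):                  # one simple K-means pass
            d = ((filled[:, None, :] - centers[None]) ** 2).sum(-1)
            labels = d.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = filled[labels == j].mean(axis=0)
        filled[mask] = centers[labels][mask]  # refresh the estimates
    return filled

# One missing coordinate in a point that clearly belongs to the second cluster.
X = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 10.0], [np.nan, 10.0]])
imputed = impute_by_clustering(X, k=2)
```

Each outer iteration pulls the filled-in value toward the centroid of the cluster the incomplete point is assigned to, which is the guidance role the filling-in scheme plays for the CPSO search in the paper.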

Keywords: Particle swarm optimization, Data clustering, Missing values, Iteration-based filling-in scheme

Title of the Paper: UWER: An Extension to a Web Engineering Methodology for Rich Internet Applications


Authors: Leonardo Machado, Orlando Filho, Joao Ribeiro

Abstract: This paper introduces UWER, an extension to an existing Web Engineering Methodology (UML For Web Engineering – UWE) to model Rich Internet Applications (RIA). Initially it presents the basic concepts behind RIA and UWE, showing the reasons for choosing this methodology among many others. After presenting the proposed extensions for UWER, a modeling instance is shown using a real web site with RIA features. This work concludes with future directions and issues to be addressed.

Keywords: Rich Internet Applications, Web Engineering Methodology, Unified Modeling Language

Title of the Paper: Supporting Architectural Design Decisions through Risk Identification Architecture Pattern (RIAP) Model


Authors: Thamer Al-Rousan, Shahida Sulaiman, Rosalina Abdul Salam

Abstract: Web projects tend to have a higher possibility of loss or failure than traditional projects. For this reason, risk management is becoming more emphasized and systemized in Web projects, so as to improve the quality of difficult decisions that normally involve a higher level of risk exposure. The software architecture process is an iterative process, and the amount of risk-related software architecture artifacts in each iteration differs from that of other iterations. Each iteration needs a unique decision-making process to accommodate certain risk factors. Since each iteration of Web project design carries different types of risks in the decision-making process, a decision support system should be tailored to satisfy the specific needs of a particular iteration. In this way, the various risks that arise through the life cycle of a project can be constantly checked and monitored. This research aims to support the architectural design decision-making process through a risk identification architecture pattern model called RIAP. The model is intended to clarify the high-level design process and to support active design decisions. Consequently, the software architecture becomes easier to communicate, maintain, and evolve. Furthermore, it supports the analysis, improvement, and reuse of architectural design decision processes in future Web projects.

Keywords: Risk identification architecture pattern, software architecture, architectural design

Title of the Paper: Comparison between Computed Shearing Forces by AASHTO Specifications And Finite Element Method of Two Continuous Spans of Voided Slab Bridge


Authors: Maher Qaqish, Emad Akawwi, Eyad Fadda, Maher Qaqish

Abstract: A voided slab bridge deck composed of two continuous spans is 17.3 m long and 16 m wide, with an overall depth of 0.9 m. The voided slab comprises 20 cm top and bottom slabs with circular voids of 50 cm in diameter, spaced 75 cm center to center. The AASHTO loadings are positioned at certain points of the deck slab to give the maximum positive and negative shear forces; these locations are determined from a one-dimensional model. The analysis of the bridge deck is carried out by two approaches: the first follows the AASHTO specifications, where a one-dimensional approach is adopted, and the second is a three-dimensional approach using finite element analysis. The maximum shears obtained by both methods are found to be in good agreement, with negligible differences.

Keywords: Bridges, voided slab, AASHTO Loadings and Finite Element Method

Title of the Paper: The Considerations of the Web Page Design


Authors: Jenn Tang

Abstract: The web page interface is often one of the key factors that determine whether browsers stay or leave. Limited visual space should be utilized to design the optimal layout in order to present the important messages of the web page, increase surfing, and fully stimulate the click-through rate. Based on the content analysis method, this study takes 326 female shopping sites as examples to categorize 45 main entities from most websites, and further derives an e-commerce page layout in which there are 7 principles of uniformity for general entities and 5 for specific entities. According to these findings, additional usability tests covering user-friendliness, usefulness, and ease of use were conducted on other randomly searched shopping sites for verification. We discovered that the entities are position-oriented. Our results offer guidelines for planning e-commerce web pages.

Keywords: E-commerce, web page layout, shopping sites, female

Title of the Paper: Improving Cache Global Consistency and Hit Ratio in Dependency Objects with Semantic Spatial Locality Correlations


Authors: Ching-Shun Hsieh, Jui-Wen Hung

Abstract: When requesting cached data on disk, the distribution of spatial locality is critical to access performance. Unfortunately, the spatial locality properties of cached data are largely ignored, and only temporal locality is considered. Besides, an individual disk object might induce different dependency relations in different applications, and possibly partial dependencies on several distributed original data sources (ODS); this interesting property is often neglected. Normally, these situations can be improved by effective storage caching, prefetching, prediction of user navigation behavior, the data layout of storage systems, and globally consistent storage replicas. In this paper, we consider this problem and solve it using data mining techniques and a service-routable consistency framework (the Global Distributed Hierarchical Cache Consistent Model, GDHCCM). The model is based on a scalable routing service algorithm that dynamically reconfigures the forwarding data path within hierarchical enterprise region portals. A novel hypergraph scheme is also proposed to represent the complex object relations among the applications. Instead of a local measure that depends only on common objects among patterns, we propose a global measure process based on the semantic properties of these patterns in the overall data set. The experiments show the effectiveness of the proposed framework: the application scenario can reduce the global patch service cost, improve performance, and minimize the turnaround time of accesses in the scope of computer games or virtual environments (VE).

Keywords: Cache, spatial locality, global consistency, hypergraph, prefetching, service routing, virtual environments

Title of the Paper: Client/Server System for Managing an Audio and Video Archive for Unique Bulgarian Bells


Authors: Tihomir Trifonov, Tsvetanka Georgieva

Abstract: In this paper, a client/server system for managing and extracting data from an audio and video archive of unique Bulgarian bells is proposed. The realized system provides users with the possibility of accessing information about different characteristics of the bells, according to their specific interests. The architecture of the Web-based system is described, as well as the services offered. The authors present the structure of the created database that stores the necessary information. A client application is realized with MATLAB, providing the possibility of searching for bells in the archive according to their sound.

Keywords: Web technologies, database, client/server system, audio/video archive, bell, sound, digital signal processing, spectral analysis, digital filter, wavelet analysis, fractal dimension

Title of the Paper: The Geological Model and the Groundwater Aspects of the Area Surrounding the Eastern Shores of the Dead Sea (DS) - Jordan


Authors: E. Akawwi, M. Kakish, N. Hadadin

Abstract: Many different cross sections were created along the eastern shores of the Dead Sea (DS). These geological cross sections were used to develop a geological model of the DS area and to determine groundwater flow directions in the surrounding area. The geological model shows that groundwater flows to the west and northwest, toward the Dead Sea, and that most of the geological units dip to the west and southwest toward the Dead Sea. In the area adjacent to the eastern shores of the Dead Sea, the B2/A7, defined as the upper aquifer, has been eroded, and the Kurnub sandstone and the Zarka and Ram sandstone groups, defined as the lower aquifers, crop out. The model shows that the groundwater flows from the east and northeast to the west and southwest toward the Dead Sea. To the east of the Dead Sea, the upper aquifer is unsaturated because it crops out.

Keywords: Dead Sea, geology, Model, groundwater, Dip, Aquifer, Kurnub

Title of the Paper: Multi Step Ahead Prediction of North and South Hemisphere Sun Spots Chaotic Time Series using Focused Time Lagged Recurrent Neural Network Model


Authors: Sanjay L. Badjate, Sanjay V. Dudul

Abstract: Multi-step ahead prediction of a chaotic time series is a difficult task that has attracted increasing interest in recent years. The interest in this work is the development of nonlinear neural network models for multi-step ahead prediction of the North and South hemisphere sunspot chaotic time series. The literature contains a wide range of different approaches, but their success depends on the predictive performance of the individual methods. The most popular neural models are based on statistics and traditional feed-forward neural networks, but such models may present some disadvantages when long-term prediction is required. In this paper, a focused time lagged recurrent neural network (FTLRNN) model with gamma memory is developed not only for short-term but also for long-term prediction, which allows better predictions of the northern and southern chaotic time series. The authors examined the performance of this FTLRNN model in predicting the dynamic behavior of typical northern and southern sunspot chaotic time series. A static MLP model is also attempted and compared against the proposed model on performance measures such as mean squared error (MSE), normalized mean squared error (NMSE), and correlation coefficient (r). The standard back-propagation algorithm with momentum term is used for both models. Parameters such as the number of hidden layers, the number of processing elements in the hidden layer, the step size, the different learning rules, the various transfer functions (tanh, sigmoid, linear-tanh, and linear-sigmoid), the different error norms L1, L2 (Euclidean), L3, L4, L5, and L∞, and different combinations of training and testing samples are exhaustively varied to obtain the optimal values of the performance measures. The obtained results indicate the superior performance of the estimated dynamic FTLRNN-based model with gamma memory over the static MLP NN on various performance metrics. In addition, the output of the proposed FTLRNN model with gamma memory closely follows the desired output in multi-step ahead prediction for all the chaotic time series considered in the study.
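The three performance measures named above can be stated concretely; the definitions below follow one common convention (NMSE taken as MSE normalized by the variance of the target series), which the paper does not spell out, so treat the exact normalization as an assumption.

```python
import numpy as np

def prediction_metrics(y_true, y_pred):
    """Mean squared error, normalized MSE (MSE over the target variance),
    and the correlation coefficient r between target and prediction."""
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.asarray(y_pred, dtype=np.float64)
    err = y_true - y_pred
    mse = np.mean(err ** 2)
    nmse = mse / np.var(y_true)
    r = np.corrcoef(y_true, y_pred)[0, 1]
    return mse, nmse, r

# A perfect prediction gives MSE = NMSE = 0 and r = 1.
mse, nmse, r = prediction_metrics([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

Under this convention, NMSE = 1 corresponds to simply predicting the mean of the series, which makes it a convenient baseline when comparing the FTLRNN and MLP models.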

Keywords: Sunspot chaotic time series, multi-step prediction, focused time lagged recurrent neural network (FTLRNN), multilayer perceptron (MLP), self-organizing feature map (SOFM)

Title of the Paper: A Document Protection Scheme using Innocuous Messages as Camouflage


Authors: Ching-Sheng Hsu, Shu-Fen Tu, Young-Chang Hou

Abstract: Lin and Lee proposed a document protection scheme which utilizes a meaningful document to cover the secret document. Although some researchers have extended and improved their scheme, the main drawbacks of Lin and Lee’s scheme remain unsolved. The aim of our study is to propose a new document protection scheme that solves these drawbacks. Instead of encoding the secret codes into index numbers, we generate the cipher message through a series of comparisons between cheating codes using the logic operator XOR. Compared with other studies, ours has the following advantages: firstly, the selection of the cheating document need not be restricted to the character set of the secret document; secondly, the length of the encoded file is the same as that of the secret document; thirdly, the codes of the cipher message are almost uniformly distributed, so the message is difficult to analyze without the key; fourthly, with the help of inner codes, our scheme is applicable to documents in any language; finally, our scheme performs efficiently and is easy to implement.
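To make the XOR idea concrete, here is a deliberately simplified sketch, not the authors' full scheme (which also involves cheating codes and inner codes): XOR the secret bytes with bytes drawn cyclically from an innocuous cheating document. Even this toy version exhibits two of the listed properties, since the cipher has the same length as the secret and decoding is the same operation applied again.

```python
def xor_encode(secret: bytes, cheat: bytes) -> bytes:
    """XOR every secret byte with a byte of the cheating document
    (repeated cyclically). XOR is its own inverse, so the same call
    with the same cheating document decodes the cipher."""
    return bytes(s ^ cheat[i % len(cheat)] for i, s in enumerate(secret))

secret = "meet at dawn".encode()
cheat = "an ordinary memo about office supplies".encode()
cipher = xor_encode(secret, cheat)
recovered = xor_encode(cipher, cheat)
```

Because the cheating document only supplies key bytes, its character set is independent of the secret document's, which is the first advantage claimed above.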

Keywords: Document Protection Scheme, Steganography, Cryptography, Encoding Method

Issue 5, Volume 6, May 2009

Title of the Paper: On the Parallelism of I/O Scheduling Algorithms in MEMS-Based Large Storage Systems


Authors: Eunji Lee, Kern Koh, Hyunkyoung Choi, Hyokyung Bahn

Abstract: MEMS-based storage is being developed as a new storage medium with several salient characteristics such as high parallelism, high density, and low power consumption. Because the physical structure of MEMS-based storage differs from that of hard disks, new software management techniques for MEMS-based storage are needed. Specifically, MEMS-based storage has thousands of heads that can be activated in parallel, which requires parallelism-aware request scheduling algorithms to maximize the performance of the storage medium. In this paper, we compare various versions of I/O scheduling algorithms that exploit the high parallelism of MEMS-based storage devices. Trace-driven simulations show that parallelism-aware algorithms can be used effectively for high-capacity mass storage servers because they outperform other algorithms in terms of average response time when the workload intensity becomes heavy.

Keywords: MEMS-based storage, Parallelism, Request Scheduling, Scheduling algorithm, Storage

Title of the Paper: Recommendation System based on the Clustering of Frequent Sets


Authors: Andrei Toma, Radu Constantinescu, Floarea Nastase

Abstract: Generating shopping recommendations has become a classical problem in knowledge engineering with extensive practical applications. In this article we propose a system for the generation of such recommendations based on considering both local and global influences.

Keywords: Shopping recommendation, frequent sets, clustering, self-organizing maps

Title of the Paper: Equilibrium Dynamic Systems Intelligence


Authors: Marius-Constantin Popescu, Onisifor Olaru, Nikos Mastorakis

Abstract: Most work in Artificial Intelligence draws on the apparatus of classic game theory to predict agent behavior in different settings. In this paper we introduce steady competitive analysis. This approach bridges the gap between the guarantee-oriented view of artificial intelligence, where a strategy must be selected to ensure a given outcome, and equilibrium analysis. We show that a risk-free strategy is able to guarantee the value obtained in the Nash equilibrium. We then discuss the concept of a competitive strategy and illustrate its use in a decentralized load-balancing setting typical of network problems. In particular, we show that when there are many agents, it is possible to guarantee an expected outcome that is an 8/9 factor of the outcome obtained in the Nash equilibrium. Finally, we discuss extending the above concept to Bayesian games and illustrate its use in a basic auction setting.

Keywords: Artificial intelligence, Nash equilibrium, Bayesian game

Title of the Paper: Resonance and Friction Compensations in a Micro Hard Drive


Authors: Wilaiporn Ngernbaht, Kongpol Areerak, Sarawut Sujitjorn

Abstract: This paper presents dynamic compensation in a micro hard drive. It reviews dynamic models of the drive in the low- and high-frequency regions. Nonlinear friction compensation for the low-frequency dynamics is achieved via a fuzzy logic controller, and forward and backward micro-step motion of the read/write head can be performed smoothly. Resonance compensation for the high-frequency dynamics is achieved via linear compensation. The paper presents comparative studies of a cascaded lead compensator, a complex lead-lag compensator, and a searched polynomial compensator. The compensated system’s performance is further enhanced by stable nonlinear control. Detailed descriptions of the modelling, design, simulation results, and analysis can be found in the paper.

Keywords: Resonance, Friction, Compensation, Adaptive tabu search, Fuzzy logic, Hard drive

Title of the Paper: Lung Area Extraction from X-Ray CT Images for Computer-Aided Diagnosis of Pulmonary Nodules by using Active Contour Model


Authors: Noriyasu Homma, Satoshi Shimoyama, Tadashi Ishibashi, Makoto Yoshizawa

Abstract: In this paper, we develop a lung area extraction technique from X-ray computed tomography (CT) images for computer-aided diagnosis (CAD) systems. In lung cancer cases, pulmonary nodules are typical pathological changes and are thus the targets to be detected by CAD systems. Isolated nodules can be detected rather easily by previously developed CAD systems, but those systems often find it hard to detect non-isolated nodules. The extraction technique can then be used to transform non-isolated pulmonary nodules connected to the chest wall into isolated ones. The technique proposed here is based on an active contour model, but such a model is often trapped in a local optimum. To avoid local optima, an essential core of the proposed technique is to select an appropriate initial contour by using an anatomical feature of the lung shape in X-ray CT slices. Experimental results demonstrate the usefulness of the proposed technique in assisting CAD systems to detect non-isolated nodules more accurately.

Keywords: Computer aided diagnosis, Active contour model, Pulmonary nodules, Anatomical feature, and X-ray CT images

Title of the Paper: An Application of Fuzzy Delphi and Fuzzy AHP on Evaluating Wafer Supplier in Semiconductor Industry


Authors: Jao-Hong Cheng, Chih-Ming Lee, Chih-Huei Tang

Abstract: Because of the pressure of globalization in the last two decades, professional service has become an important strategic decision, so that supplier selection is a prime concern. In the semiconductor industry, prior research has worked on analyzing and improving the process and on evaluating equipment manufacturers. Since the semiconductor industry applies a huge range of advanced technologies, foundry and DRAM manufacturers acquire a large volume of critical materials and components from wafer suppliers. Consequently, this study aims to identify the critical factors related to wafer supplier selection; how to improve the current position of the semiconductor industry and its wafer suppliers in Taiwan has also become a new subject. The primary criteria for evaluating supplier selection are acquired through a literature survey, wafer manufacturers’ data, and the fuzzy Delphi method (FDM); the fuzzy analytic hierarchy process (FAHP) is then employed to calculate the weights of these criteria, so as to establish a fuzzy multi-criteria model of wafer supplier selection. The results indicated the greatest weight on the dimension of wafer supplier selection, and seven critical criteria related to wafer supplier selection: (1) wafer quality, (2) delivery time, (3) service, (4) price, (5) process capability, (6) reputation, and (7) past performance.

Keywords: Wafer Supplier, Supply Chain, Semiconductor, Analytic Hierarchy Process (AHP), Fuzzy Delphi Method (FDM), Fuzzy Analytic Hierarchy Process (FAHP), Fuzzy Multi-Criteria Decision Making (FMCDM)
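The FAHP weighting step named in the abstract can be sketched as follows. This is a minimal illustration of the fuzzy-geometric-mean variant of FAHP with triangular fuzzy numbers; the 3x3 pairwise comparison matrix, the criteria it compares and the centroid defuzzification are invented for the example and are not taken from the paper.

```python
def fuzzy_geometric_mean(row):
    """Element-wise geometric mean of a row of triangular fuzzy numbers."""
    l = m = u = 1.0
    for (a, b, c) in row:
        l, m, u = l * a, m * b, u * c
    n = len(row)
    return (l ** (1 / n), m ** (1 / n), u ** (1 / n))

def fahp_weights(matrix):
    gms = [fuzzy_geometric_mean(row) for row in matrix]
    total = tuple(sum(g[i] for g in gms) for i in range(3))
    # fuzzy weight = gm_i * total^-1; inverting reverses the (l, m, u) order
    fuzzy_w = [(g[0] / total[2], g[1] / total[1], g[2] / total[0]) for g in gms]
    crisp = [(l + m + u) / 3 for (l, m, u) in fuzzy_w]  # centroid defuzzification
    s = sum(crisp)
    return [w / s for w in crisp]

one = (1, 1, 1)   # triangular fuzzy numbers (l, m, u)
matrix = [        # invented comparisons: e.g. quality vs delivery vs price
    [one, (2, 3, 4), (4, 5, 6)],
    [(1/4, 1/3, 1/2), one, (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), one],
]
weights = fahp_weights(matrix)
```

The normalized crisp weights then rank the criteria, as the paper does for its wafer-supplier criteria.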

Title of the Paper: Image Compression via Textual Substitution


Authors: Bruno Carpentieri

Abstract: Textual substitution methods, often called dictionary methods or Lempel-Ziv methods after the seminal work of Lempel and Ziv, are one-dimensional compression methods that maintain a constantly changing dictionary of strings to adaptively compress a stream of characters by replacing common substrings with indices (pointers) into the dictionary. Lempel and Ziv proved that the proposed schemes were practical as well as asymptotically optimal for a general source model. Two-dimensional (i.e., image) applications of textual substitution methods have been widely studied in the past: these applications first apply a linearization strategy to the input data and then encode the resulting one-dimensional vector using LZ-type one-dimensional methods. More recent strategies blend textual substitution methods with vector quantization. In this paper we discuss textual substitution methods for image compression, with particular attention to the AVQ class of algorithms, and review recent advances in the field.

Keywords: Data Compression, Dictionary Compression, Image Compression, Vector Quantization
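The dictionary mechanism the abstract describes can be illustrated with a minimal LZ78 coder applied to a raster-scanned (linearized) row of pixel symbols; the toy symbol stream below is invented, and the AVQ algorithms the paper surveys are considerably more sophisticated.

```python
def lz78_encode(s):
    dictionary = {}          # phrase -> index
    out, phrase = [], ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch
        else:
            out.append((dictionary.get(phrase, 0), ch))  # (pointer, new symbol)
            dictionary[phrase + ch] = len(dictionary) + 1
            phrase = ""
    if phrase:
        out.append((dictionary[phrase], ""))             # flush trailing phrase
    return out

def lz78_decode(tokens):
    phrases = [""]           # index 0 is the empty phrase
    out = []
    for idx, ch in tokens:
        p = phrases[idx] + ch
        phrases.append(p)
        out.append(p)
    return "".join(out)

pixels = "aaabbbaaabbbaaab"   # a toy linearized image row
tokens = lz78_encode(pixels)
```

Repetitive runs in the linearized data collapse into few (pointer, symbol) tokens, which is exactly why linearization plus LZ coding compresses images with regular texture.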

Title of the Paper: PCB Inspection for Missing or Misaligned Components using Background Subtraction


Authors: K. Sundaraj

Abstract: Automated visual inspection (AVI) is becoming an integral part of the modern surface mount technology (SMT) assembly process. This high-technology assembly produces printed circuit boards (PCBs) with tiny and delicate electronic components. With the increase in demand for such PCBs, high-volume production has to cater for both quantity and zero-defect quality assurance. The ever-changing technology in the fabrication, placement and soldering of SMT electronic components has caused an increase in PCB defects, both in number and in type. Consequently, a wide range of defect detection techniques and algorithms have been reported and implemented in AVI systems in the past decade. However, inspection turnaround is crucial in the electronics industry, and current AVI systems spend too much time inspecting PCBs on a component-by-component basis. In this paper, we focus on providing a solution that can cover a larger inspection area of a PCB at any one time, which reduces inspection time and increases the throughput of PCB production. Our solution targets missing and misalignment defects of SMT devices on a PCB. An alternative visual inspection approach using color background subtraction is presented to address these defects. Experimental results on various defective PCBs are also presented.

Keywords: PCB Inspection, Background Subtraction, Automated Visual Inspection
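A minimal sketch of the colour background subtraction idea: pixels of a test-board image that differ from a defect-free reference beyond a threshold are flagged, and a missing component shows up as a flagged region. The images are toy nested lists of (R, G, B) tuples and the L1 colour distance and threshold are illustrative choices, not the paper's exact method.

```python
def subtract(reference, test, threshold=60):
    """Return a binary defect mask (1 = pixel differs from the reference)."""
    mask = []
    for ref_row, test_row in zip(reference, test):
        row = []
        for (r1, g1, b1), (r2, g2, b2) in zip(ref_row, test_row):
            diff = abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)  # L1 colour distance
            row.append(1 if diff > threshold else 0)
        mask.append(row)
    return mask

def defect_area(mask):
    return sum(sum(row) for row in mask)

GREEN, BLACK = (0, 120, 0), (20, 20, 20)          # solder mask vs chip body
reference = [[GREEN] * 4, [GREEN, BLACK, BLACK, GREEN], [GREEN] * 4]
missing   = [[GREEN] * 4, [GREEN] * 4, [GREEN] * 4]   # the chip is absent
mask = subtract(reference, missing)
```

Because the whole frame is differenced at once, a large board area is inspected in a single pass rather than component by component.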

Title of the Paper: An Automated Ligand Evolution System using Bayesian Optimization Algorithm


Authors: Masaharu Munetomo, Kiyoshi Akama, Haruki Maeda

Abstract: Ligand docking checks whether a drug chemical, called a ligand, matches the target receptor protein of a human organ. Docking by computer simulation is becoming popular in the drug design process to reduce the cost and time of chemical experiments. This paper presents a novel approach that generates optimal ligand structures from scratch, based on the de novo ligand design approach and employing the Bayesian optimization algorithm, to realize automated design of drugs and other chemical structures. The proposed approach searches for an optimal ligand structure that minimizes the bond energy to the receptor protein; the ligand structure is generated by adding small molecular fragments to a base structure. The decision to add fragments is controlled by the Bayesian optimization algorithm, which is considered a promising approach among probabilistic model-building genetic algorithms. We have built a system that automatically generates an optimal ligand structure and, through numerical experiments performed on a PC cluster, we show the effectiveness of our approach compared to the conventional approach using classical genetic algorithms.

Keywords: Automated drug design, ligand docking, screening, de novo ligand design approach, probabilistic model-building genetic algorithms, estimation of distribution algorithms, Bayesian optimization algorithms

Title of the Paper: Technical Solutions for Integrated Trading on Spot, Futures and Bonds Stock Markets


Authors: Vlad Diaconita, Ion Lungu, Adela Bara

Abstract: This article is an extended version of a paper presented at the WSEAS MCBE09 conference [10], in which we present in more detail a practical solution for building an integration tier between an online trading platform and two stock exchange markets, in an SOA-like architecture. Our solution is built with XML, Java and PL/SQL rather than costly third-party products. This open-source approach, even if it is more difficult to develop and implement at first, lets a company retain control of the solution. The system is not tied to SOA vendors, which usually keep their software as closed as possible and demand, at additional cost, to carry out all future developments themselves. We use XML for communication not only because of the constraints imposed by the systems we integrate, but also because it is the natural choice in an SOA-like solution. XML is being used to enable web services and similar, often custom, RPC functionality, allowing greater access to data across multiple systems within an organization and making it possible to create future systems from collections of such RPC functionality [5, 6].

Keywords: XML, integration, SOA, PL/SQL, Java, Threads, Spot Market, Futures Market

Title of the Paper: An Optimized Location-based Mobile Restaurant Recommend and Navigation System


Authors: Zhi-Mei Wang, Fan Yang

Abstract: With the widespread use of GPS-equipped intelligent mobile phones, location-based services have become a hot issue in mobile communications research. This paper implements a mobile location-based restaurant navigation and recommendation system. In order to improve server-side response speed for real-time queries, we propose a memory pool model, an extended Accept command, no-data client polling and an interrupt mechanism, which together greatly optimize the server-side control procedures. On the client side, we combine the latest Web 2.0 application data with location-based data and propose collaborative assessment and recommendation mechanisms, which can provide users with real-time location-based restaurant recommendations and personalized navigation.

Keywords: Mobile Information Share, GPS, Web2.0, Location Based Service (LBS), Tagging, Collaborative Filtering, Personalized Recommendation

Title of the Paper: Spectral Representations of Alpha Satellite DNA


Authors: Petre G. Pop

Abstract: The detection of tandem repeats can be used for phylogenetic studies and disease diagnosis. The numerical representation of genomic signals is very important, as many of the methods for detecting repeated sequences come from the DSP field and involve applying some transformation. Applying a transform technique requires mapping the symbolic domain into the numeric domain in such a way that no additional structure is imposed on the symbolic sequence beyond that inherent to it. Here we investigate the application of spectral analysis and spectrograms, using a novel numerical representation, to identify and study alpha satellite higher order repeats in human chromosomes 7 and 17.

Keywords: Sequence Repeats, DNA Representations, Alpha Satellite DNA, Spectral Analysis, Spectrograms
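The DSP view of repeat detection described above can be sketched by mapping a symbolic DNA sequence onto numbers and looking for peaks in its magnitude spectrum. The four-symbol complex mapping below is one common generic choice in genomic signal processing, not necessarily the novel representation the paper proposes, and the tandem-repeat sequence is synthetic.

```python
import cmath

MAPPING = {'A': 1, 'C': 1j, 'G': -1, 'T': -1j}   # one common complex mapping

def dft_magnitudes(seq):
    x = [MAPPING[b] for b in seq]
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

def dominant_period(seq):
    """Repeat period suggested by the strongest non-DC spectral line."""
    mags = dft_magnitudes(seq)
    n = len(mags)
    k = max(range(1, n // 2 + 1), key=mags.__getitem__)
    return n / k

tandem = "ACGT" * 8            # synthetic tandem repeat, period 4
period = dominant_period(tandem)
```

A spectrogram, as used in the paper, applies this spectrum computation over a sliding window so that repeat regions show up as horizontal lines at the repeat frequency.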

Title of the Paper: Block-Based Motion Estimation Analysis for Lip Reading User Authentication Systems


Authors: Khaled Alghathbar, Hanan A. Mahmoud

Abstract: This paper proposes a lip reading technique for speech recognition using motion estimation analysis. The method described in this paper represents a sub-system of the Silent Pass project. Silent Pass is a lip reading password entry system for security applications; it provides user authentication based on password lip reading. Motion estimation is performed on lip movement image sequences representing speech. In this methodology, the motion estimation is computed without extracting the speaker's lip contours and location, which yields robust visual features for lip movements representing utterances. Our methodology comprises two phases: a training phase and a recognition phase. In both phases, an n x n video frame of the image sequence for an utterance (which can be an alphanumeric character, a word or, in more complicated analysis, a sentence) is divided into m x m blocks. Our method calculates and fits eight curves for each frame, each representing the motion estimation of that frame in a specific direction. These eight curves form the feature set of a specific frame and are extracted in an unsupervised manner. The feature set consists of the integral values of the motion estimation and is expected to be highly effective in the training phase. The feature sets are used to characterize specific utterances with no additional acoustic features. A corpus of utterances and their motion estimation features is built in the training phase. The recognition phase extracts the feature set from a new image sequence of lip movement for an utterance and compares it to the corpus using the mean square error metric.

Keywords: Lip reading, Speech recognition, Motion estimation, User authentication, Feature Extraction
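The core operation behind block-based motion estimation is exhaustive block matching, sketched below on toy grey-level frames; the block size, search range and MSE criterion are illustrative, and the paper's eight directional curves are built on top of such per-block motion vectors.

```python
def block_mse(prev, curr, by, bx, dy, dx, n):
    """MSE between the n x n block of curr at (by, bx) and the block of
    prev displaced by (dy, dx)."""
    err = 0
    for y in range(n):
        for x in range(n):
            d = curr[by + y][bx + x] - prev[by + dy + y][bx + dx + x]
            err += d * d
    return err / (n * n)

def motion_vector(prev, curr, by, bx, n=2, search=2):
    """Best (dy, dx) within +/- search matching the current block; the vector
    points from the current block back to its origin in the previous frame."""
    h, w = len(prev), len(prev[0])
    best, best_mv = float("inf"), (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if 0 <= by + dy <= h - n and 0 <= bx + dx <= w - n:
                e = block_mse(prev, curr, by, bx, dy, dx, n)
                if e < best:
                    best, best_mv = e, (dy, dx)
    return best_mv

# A bright 2x2 patch (e.g. part of the lip) moves one pixel right.
prev = [[0] * 6 for _ in range(6)]
curr = [[0] * 6 for _ in range(6)]
for y in (2, 3):
    prev[y][2] = prev[y][3] = 255
    curr[y][3] = curr[y][4] = 255
mv = motion_vector(prev, curr, by=2, bx=3)
```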

Title of the Paper: Agricultural Productivity Potential Assessment by Using Rainfall Contribution Index in Sub-Sahara Africa


Authors: Yu-Min Wang, Seydou Traore, Willy Namaona, Tienfuan Kerh

Abstract: Food deficit alleviation is the most important aspect of poverty reduction in the entire Sub-Saharan African (SSA) region and can be achieved by increasing agricultural productivity. The deficit is, in one way or another, attributable to inefficient and insufficient use of rainwater. This paper therefore examines the scope for meeting crop water demand under rainfed conditions, based on the suitable-planting-period approach in SSA, in order to take advantage of favorable climatic conditions. Climatic data collected from 1996 to 2005 in Ouagadougou and Ngabu, located in Burkina Faso and Malawi respectively, were used in this study. The rainfall contribution index and a yield estimation model were introduced to examine whether available rainwater suffices for the crop water demand and to predict yields. Rainfed agriculture in the study sites is characterized by a short, monomodal rainy season, lasting from May to September in Ouagadougou and from November to April in Ngabu. Ngabu receives an annual average of 912 mm of rainfall, while Ouagadougou receives 698 mm. Based on the index, crop water requirements are higher in Ouagadougou than in Ngabu regardless of the crop and planting date, and rainwater is more sufficient in Ngabu than in Ouagadougou. Following the suitable planting periods determined in this study might increase the yields of maize, bean, millet and groundnut by 10.31, 16.22, 10.57 and 4.82% in Ouagadougou, and by 5.00, 7.41, 7.14 and 4.30% in Ngabu, respectively. The suitable planting periods are therefore recommended for reducing the gap between water supply and demand for the selected crops under rainfed conditions, so that crop productivity may increase.

Keywords: Agricultural water, food deficit, planting period, rainfall contribution index, effective rainfall, productivity
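The paper's exact rainfall contribution index is not reproduced here; as an illustration of the idea only, the sketch below takes the index for a candidate planting window to be the ratio of effective rainfall to crop water requirement in that window (capped at 1), and picks the window with the highest index. All numbers are invented monthly values, not data from the study sites.

```python
def contribution_index(effective_rain, crop_demand):
    """Ratio of effective rainfall to crop water requirement, capped at 1."""
    return min(sum(effective_rain) / sum(crop_demand), 1.0)

def best_planting_month(monthly_rain, crop_demand):
    """Slide the crop's demand profile over the year; return (month, index)."""
    n, m = len(monthly_rain), len(crop_demand)
    def index_at(start):
        window = [monthly_rain[(start + i) % n] for i in range(m)]
        return contribution_index(window, crop_demand)
    best = max(range(n), key=index_at)
    return best, index_at(best)

rain = [0, 0, 5, 10, 80, 150, 180, 160, 90, 20, 5, 0]  # toy monthly mm, Jan..Dec
demand = [100, 120, 120, 90]                           # toy 4-month crop, mm/month
month, index = best_planting_month(rain, demand)
```

With these toy values the best window starts in May, matching the intuition that planting should track the rainy season.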

Title of the Paper: Database Analysis Models used for Studying the Residential Assemble Market


Authors: Mirela-Catrinel Voicu, Andreea Banciu, Mihai Dragota, Raul Turcu

Abstract: In this paper we present our own study of residential assemblies, starting from a particular data set presented in [6]. We build a database and present our results concerning these data, together with an algorithm for obtaining sets of aggregated values. To exemplify its application, we study residential assemblies that include three-room apartments.

Keywords: Residential assemblies, aggregated value sets, economical analyses, relational databases, programming environment

Title of the Paper: Management and Object Behavior of Statecharts through Statechart DNA


Authors: Benjamin De Leeuw, Albert Hoogewijs

Abstract: We propose composed strings called "statechart DNA" as essential building blocks for a new statechart (sc) abstraction method. We define the simplified statechart (ssc) and show that our definition covers the UML 2.0 sc model, by matching it to all model elements of the StateMachine package of the UML 2.0 metamodel and to the OCL constraints on these model elements. A Model Driven Architecture (MDA) is defined, inspired by a PIM-to-PIM model transformation procedure between UML sc models and ssc models. We discuss the rationale behind action abstraction in ssc models. This framework is used to isolate sc DNA, first in ssc models, then in UML sc models. We show how sc DNA, a compaction of sc construction primitives, can be used to define behavior model metrics and, more generally, to manage and maintain evolving object behavior. State machine versioning is an important application of statechart DNA to manage industrial model repositories.

Keywords: Statecharts, UML, Model checking, State machine versioning

Title of the Paper: Artificial Neural Networks Application for Stress Smoothing in Hexahedrons


Authors: Leonardo Ivirma, Mary Vergara, Sebastian Provenzano, Francklin Rivas, Anna Perez, Francisco Fuenmayor

Abstract: This paper presents the use of artificial neural networks to improve the stress fields obtained from finite element discretization. The time needed to reach solutions was significantly reduced, with accuracy similar to that of the patch-based stress smoothing methods: Superconvergent Patch Recovery (SPR) and improved Recovery by Equilibrium in Patches (REP). Two cases are solved that show the comparative advantage, in terms of time, of the neural network over the techniques above for improving the original solution: the artificial neural networks used only 7% and 70%, respectively, of the time spent by the smoothing technique in these cases. The larger the problem, the greater the difference in the time required for the solutions, in favor of the neural network. The data used for this study come from cases with different features: one with a smooth solution, a thick-walled sphere under internal pressure, and one with singularities, a plate loaded with a lateral crack.

Keywords: Superconvergent Patch Recovery, Stress Smoothing, Artificial Neural Networks

Issue 6, Volume 6, June 2009

Title of the Paper: Organizational Structural Strategies in Risk Management Implementation: Best Practices and Benefits


Authors: Noor Habibah Arshad, Azlinah Mohamed, Ruzaidah Mansor

Abstract: When organizations embark on the implementation of IS projects, they need to be aware of the potential risks associated with the project and should practice risk management to mitigate them. Risk management needs to address all factors, such as organizational, human, process and operational factors, that can affect project success; hence, risk management is essential for the successful delivery of Information System (IS) projects. Risk management best practices provide guidance for adopting a more consistent and systematic risk management approach and methodology for mitigating risk. Therefore, the aim of this study is to explore risk management best practices and their benefits in Information System (IS) projects in the Malaysian Information Technology (IT) industry, particularly with regard to organizational structural strategies. The primary data for this research were collected by means of interviews, observations and document reviews conducted at eleven private and public organizations in Malaysia. The findings show that organizational structural strategies align the organization's strategies, technology and knowledge. Furthermore, the establishment of risk management practices is a strategic mechanism for managing and controlling IS project risk.

Keywords: Information System; Risk Management; Best Practices; Benefit of Best Practices; Organizational Structural

Title of the Paper: IT Outsourcing: An Exploratory Study Based on Transaction Cost Theory, Relational Exchange Theory and Agent Theory


Authors: Syaripah Ruzaini Syed Aris, Noor Habibah Arshad, Azlinah Mohamed

Abstract: The bandwagon effect of the Kodak and IBM agreement has led more organizations to become involved in IT outsourcing. Even so, IT outsourcing is not a panacea. Many researchers have tried to come up with ways to manage IT outsourcing effectively: some are anecdotal, some have theoretical foundations and some are supported by experimental results. Still, there is a need to explore current practice so as to identify its weaknesses and vulnerabilities. In a developing country, the acceptance of the effectiveness of theory to support IT outsourcing remains a question. Therefore, besides assessing best practices, this paper also reviews the practices of three theories, namely Transaction Cost Theory, Relational Exchange Theory and Agent Theory. To achieve this objective, an exploratory, qualitative research method was used, with nine organizations selected as a sample. The results show that even though proper guidelines are available, some organizations omit important steps. More surprisingly, some organizations deny the importance of adapting theories in current IT outsourcing practices and, as a consequence, encountered difficulty in managing their projects. In future work, the weaknesses and vulnerabilities of current practices will be addressed and a framework for managing IT outsourcing will be proposed.

Keywords: IT outsourcing, Transaction Cost Theory, Relational Exchange Theory, Agent Theory, Initial Review, Tender Evaluation, Contract Management, Project Roll-on

Title of the Paper: IT Governance Mechanisms in Managing IT Business Value


Authors: Mario Spremic

Abstract: Most organizations in all sectors of industry, commerce and government are fundamentally dependent on their information systems (IS) and would quickly cease to function should the technology (particularly information technology, IT) that underpins their activities ever come to a halt [15]. The development and governance of a proper IT infrastructure may have enormous implications for the operation, structure and strategy of organizations. IT and IS may contribute towards efficiency, productivity and competitiveness improvements of both inter-organizational and intra-organizational systems [1]. The business value derived from IT investments emerges only through business changes and innovations, whether product/service innovation, new business models or process change. In this paper, the relatively new concept of IT Governance and its mechanisms are explained in further detail. IT Governance is the process of controlling an organization's IT resources, including information and communication systems and technology [8]. According to the IT Governance Institute [10], IT governance can be seen as a structure of relationships and processes to direct and control the enterprise's use of IT to achieve the enterprise's goals by adding value while balancing risk versus return over IT and its processes. While IT management is mainly focused on the daily effective and efficient supply of IT services and operations, IT governance is a much broader concept that focuses on performing and transforming IT to meet the present and future demands of the business and its customers. IT Governance may be implemented through its key mechanisms, such as business/IT strategic alignment, value creation and delivery, risk management (value preservation), resource management and performance measurement. In this paper, key analytical IT Governance mechanisms, such as information system audit and IT risk management, are explained in further detail.

Keywords: IT Governance, Information System Audit, CobiT

Title of the Paper: Knowledge Transfer Success Factors in IT Outsourcing Environment


Authors: Azlinah Mohamed, Noor Habibah Arshad, Nurul Aisyah Sim Abdullah

Abstract: Due to the advancement of technology, organizations depend more and more on information technology in order to stay competitive. Thus, there is an increase in IT outsourcing (ITO) activities, especially for non-IT companies. There are many types of IT outsourcing, namely complete outsourcing, facility management outsourcing and system integration outsourcing. Many of these outsourcing projects are reported to face risk due to technology and market shifts when exercising ITO. This includes loss of knowledge when the outsourcing consumer (OSC) is left with little or no knowledge of the product developed, implemented and maintained by the outsourcing service provider (OSP). Thus, this paper addresses the importance of protecting organizational knowledge and of ensuring that it is transferred and shared throughout the organization in order to stay competitive. To this end, factors influencing the success of knowledge transfer processes (KTP) within the context of ITO environments were identified. Seventeen attributes are proposed based upon the predetermined key factors: the knowledge provider (vendor), the knowledge to be transferred, the knowledge receiver (client) and the knowledge infrastructure. In order to validate the framework, data were collected using a survey and later statistically analyzed. The refined framework incorporating best practices yielded four key factors, with two influencing attributes specified for the first factor, three for the second, two for the third and six for the fourth. This framework can be seen as an integration of several important elements involved in KTP, which need to be considered as important aspects in facilitating KTP in an ITO environment.

Keywords: IT Outsourcing; Knowledge Transfer; Knowledge Provider; Knowledge Receiver; Outsourcing Provider; Outsourcing Consumer

Title of the Paper: Digital Ecosystem Access Control Management


Authors: Ilung Pranata, Geoff Skinner

Abstract: The newly emerging concept of the Digital Ecosystem (DE) has played a significant role in today's technology, especially in enabling Small and Medium Enterprises (SMEs) to adopt Information and Communication Technology (ICT) in their businesses. DE reveals opportunities to enhance the productivity and efficiency of each business transaction and thereby further contributes to the success of the enterprise's businesses. Along with the advancement of DE technology, security has emerged as a vital element in protecting the resources and information of the interacting DE member entities. However, current developments of security mechanisms for protecting these resources are still in their infancy. This paper proposes a distributed mechanism for individual enterprises to manage their own authorization processes and information access permissions, with the aim of providing rigorous protection of enterprise resources.

Keywords: Information management, authorization, authentication, access permissions, distributed resource protection

Title of the Paper: An Analysis of Different Variations of Ant Colony Optimization to the Minimum Weight Vertex Cover Problem


Authors: Milan Tuba, Raka Jovanovic

Abstract: Ant colony optimization (ACO) has previously been applied to the Minimum Weight Vertex Cover Problem with very good results. The performance of the ACO algorithm can be improved with different variations of the basic Ant Colony System algorithm, such as elitism, the rank-based approach and the MinMax system. In this paper, we analyze the effectiveness of these variations applied to the Minimum Weight Vertex Cover Problem for different problem cases. The analysis is based on several properties of the solutions acquired by these algorithms: the best found solution, average solution quality, and the dispersion and distribution of solutions.

Keywords: Ant Colony, Minimum Weight Vertex Cover, Optimization Problems, Population Based Algorithms
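A minimal Ant System sketch for the Minimum Weight Vertex Cover Problem may clarify what the compared variants have in common: ants build covers probabilistically from pheromone and a weight-aware heuristic, then pheromone evaporates and the best solution is reinforced. The elitist, rank-based and MinMax variants the abstract names differ only in the update rule; the graph, weights and parameters below are toy values.

```python
import random

def build_cover(edges, weights, tau, beta=2.0):
    """One ant constructs a vertex cover probabilistically."""
    uncovered = set(edges)
    cover = set()
    while uncovered:
        candidates = sorted({v for e in uncovered for v in e} - cover)
        scores = []
        for v in candidates:
            gain = sum(1 for e in uncovered if v in e)     # edges v would cover
            scores.append(tau[v] * (gain / weights[v]) ** beta)
        r, acc = random.random() * sum(scores), 0.0
        pick = candidates[-1]
        for v, s in zip(candidates, scores):               # roulette-wheel choice
            acc += s
            if acc >= r:
                pick = v
                break
        cover.add(pick)
        uncovered = {e for e in uncovered if pick not in e}
    return cover

def aco_mwvc(edges, weights, ants=10, iters=40, rho=0.5, seed=1):
    random.seed(seed)
    tau = {v: 1.0 for v in weights}                        # initial pheromone
    best_cover, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            cover = build_cover(edges, weights, tau)
            cost = sum(weights[v] for v in cover)
            if cost < best_cost:
                best_cover, best_cost = cover, cost
        for v in tau:
            tau[v] *= 1 - rho                              # evaporation
        for v in best_cover:
            tau[v] += 1.0 / best_cost                      # reinforce best-so-far
    return best_cover, best_cost

edges = [(1, 2), (1, 3), (1, 4), (4, 5)]                   # toy graph
weights = {1: 2, 2: 5, 3: 5, 4: 3, 5: 10}
cover, cost = aco_mwvc(edges, weights)
```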

Title of the Paper: The Effects of Sunshine-Induced Mood on Bank Lending Decisions and Default Risk: An Option-Pricing Model


Authors: Jyh-Jiuan Lin, Jyh-Horng Lin, Rosemary Jou

Abstract: Even though psychological evidence and casual intuition predict that weather may lead to changes in equity returns, little attention has been paid to these changes through asset pricing mechanisms. This paper fills the gap by examining the effects of the upbeat mood induced by sunny weather on bank spread management and default risk. An option-based model of bank spread behavior is developed to study these closely related phenomena. The model is designed to capture the fat tails of loan repayments caused by the mood effects induced by good weather. With the good-mood influence on bank lending, this paper shows that sunshine is negatively correlated with the default risk in equity returns.

Keywords: Default Risk, Sunny Weather, Upbeat Mood, Fat Tails

Title of the Paper: Rescue Plan, Bank Interest Margin and Future Promised Lending: An Option-Pricing Model


Authors: Jyh-Jiuan Lin, Ching-Hui Chang, Jyh-Horng Lin

Abstract: This paper examines a bank rescue plan for future lending. We demonstrate that an increase in the loans guaranteed by the government, or in the bank's responsibility for the first stake of any losses, results in an increased interest margin. Eventually, the plan will be lifted when the bank becomes healthy, and the bank will keep its promise to increase its future lending at a reduced margin.

Keywords: Bank Rescue Plan, Interest Margin, Future Lending

Title of the Paper: Improvement of Document Understanding Ability through the Notion of Answer Literal Expansion in Logical-linguistic Approach


Authors: A. K. Rabiah, T. M. T. Sembok, B. Z. Halimah

Abstract: Document understanding offers an interesting alternative to the kinds of special-purpose, carefully constructed evaluations that have driven much recent research in language understanding. It involves reading a specific text document and answering questions about it, demonstrating one's understanding of the document by returning exact phrase answers. This research aims to implement the proposed logical formalisms by expanding the notion of the answer literal for understanding tasks such as question answering. This paper modifies the skolem arguments to broaden the notion of the answer literal to all contexts of question, including universal quantifiers and ground terms. Two symbols are used: fn represents quantified variable names, while gn represents ground term variable names. Expanding the notion of the answer literal enables the document to be tested by all contexts of question, including universally quantified and ground term variables. Both answers link to the concept of capability that is considered in this experiment.

Keywords: Document Understanding, Linguistic Query, Logical Approach

Title of the Paper: Integration of Heterogeneous In-service Training Data into a Nationwide Database


Authors: Lung-Hsing Kuo, Hung-Jen Yang, Hsieh-Hua Yang, Jui-Chen Yu, Li-Min Chen

Abstract: The integration of heterogeneous in-service training data offers possibilities to manually and automatically derive new information about professional human resources that is not available when using only a single data source. Furthermore, it allows for a consistent representation and the propagation of updates from one data set to the others. However, different acquisition methods, data schemata and updating cycles of the content can lead to discrepancies in the accuracy and correctness of professions and expertise, which hamper the combined integration. To overcome these difficulties, appropriate methods for the integration and harmonization of data from different sources and of different types are needed. In this study, twenty-five databases were integrated into one national-level database. More than 220,000 K-12 teachers' in-service training records were collected throughout this heterogeneous database integration project. A unified subject category was introduced, based upon both teaching professions and administration professions, so as to absorb all twenty-five sources. The feasibility of the integration was evaluated according to its efficiency and correctness.

Keywords: Integration, Nationwide databases, In-service training data, Heterogeneous database integration

Title of the Paper: Context-Based Rate Distortion Estimation and its Application to Wavelet Image Coding


Authors: Hsi-Chin Hsin, Tze-Yun Sung

Abstract: Embedded image coding in the wavelet domain has drawn a lot of attention. Among the noteworthy algorithms is the embedded block coding with optimized truncation (EBCOT) algorithm, which has been adopted by the JPEG2000 standard. EBCOT is a two-tier algorithm. Tier-1 consists of bit-plane coding followed by entropy coding. Tier-2 performs the so-called post-compression rate distortion optimization (PCRD), which requires a large memory space for storing all the code streams of code blocks; however, some code blocks of lesser importance might not be needed for the optimal decoded image at a given bit rate. To avoid wasting computational power and memory space, a simple context-based rate distortion estimation (CBRDE) is proposed to arrange the scanning order of code blocks in an adaptive manner. CBRDE is based on the MQ table of JPEG2000, which is available at both the encoder and the decoder; as a result, there is no need to store and transmit the rate distortion information of code blocks. Experimental results show that the rate distortion curves are almost convex, which demonstrates the potential of CBRDE for embedded wavelet image coding.

Keywords: Embedded image coding; wavelet transform; JPEG2000; EBCOT; PCRD; CBRDE

Title of the Paper: Classification of Wetland from TM Imageries based on Decision Tree


Authors: Yuan Hui, Zhang Rongqun, Li Xianwen

Abstract: The traditional methods of applying remote sensing data to land cover mapping are supervised and unsupervised classification. The decision tree, which shows great advantages in remote sensing classification, is computationally fast, makes no statistical assumptions, and can handle data represented on different measurement scales. Decision tree classification has been successfully applied to many classification problems, but rarely to the mapping of wetlands. In this study, a decision tree was used to extract wetland from Landsat 5 Thematic Mapper (TM) imagery over a wide area of the Yinchuan plain. The Tasseled Cap (TC) transformation was used to identify the different wetland types, and the normalized difference vegetation index (NDVI) was computed to distinguish paddy wetland from lake wetland. The results show that the decision tree outperforms supervised classification with the maximum likelihood method: the overall accuracy of supervised classification was 64.60%, while that of decision tree classification was 83.80%. It thus appears that a decision tree combining different kinds of useful knowledge is an effective and promising classification method.

Keywords: Classification methods, Decision tree, Wetland, Tasseled cap transformation, NDVI
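
The NDVI step of the paper's decision tree can be sketched as a single decision node. A minimal sketch, assuming hypothetical reflectance values and an illustrative 0.3 threshold (the paper does not report its thresholds here):

```python
# Sketch of a decision-tree-style rule for separating wetland types using
# NDVI computed from TM red (band 3) and near-infrared (band 4) reflectance.
# The threshold and reflectance values are illustrative, not the paper's.

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

def classify_pixel(nir, red, ndvi_threshold=0.3):
    """Toy decision node: vegetated (paddy-like) vs. open-water (lake-like)."""
    value = ndvi(nir, red)
    return "paddy wetland" if value >= ndvi_threshold else "lake wetland"

print(classify_pixel(nir=0.45, red=0.10))  # strongly vegetated pixel
print(classify_pixel(nir=0.05, red=0.08))  # water absorbs near-infrared
```

A full tree would add further nodes on the Tasseled Cap components before this NDVI split.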

Title of the Paper: Malay Document Analysis and Recognition


Authors: Norzaidah Md Noh, Mohd Rusydi Abdul Talib, Azlin Ahmad, Shamimi A. Halim, Azlinah Mohamed

Abstract: Malay Document Analysis and Recognition aims to extract digital Malay documents automatically. The extracted documents take forms such as articles, newspapers and magazines. Over the years, the number of Malay digital documents published on the world-wide web (WWW) has increased, and they are consequently used by many organizations both locally and abroad. In this paper, we introduce the implementation of a tool for Malay language document identification in mono- and multi-lingual documents. The tool combines feature extraction with a neural network technique. The feature extraction consists of document filtering, word matching and binary representation of variable-length sentences from many types of documents, including generic text files, MS Word files, Adobe PDF and HTML web pages. The neural network employs the back propagation neural network (BPNN) algorithm with an adjustable number of neurons and weights between the input, hidden and output layers. A database was constructed consisting of 300 sentences from mono- and multi-lingual documents. Experiments show an average recognition rate of 90% in recognizing Malay language documents that contain more than 80% matched Malay words. Our tool is thus able to recognise Malay language documents with reasonable accuracy.

Keywords: Document processing, Language recognition, Backpropagation neural network, Document filtering, Word matching technique
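
The word-matching criterion in the abstract (a document counts as Malay when more than 80% of its words match Malay words) can be sketched directly. The tiny lexicon below is an invented stand-in for the authors' dictionary, and the neural network stage is omitted:

```python
# Toy version of the word-matching step: the fraction of a sentence's words
# found in a Malay lexicon decides whether the sentence counts as Malay.
# The 80% threshold follows the abstract; the word list is a stand-in.

MALAY_LEXICON = {"saya", "dan", "yang", "ini", "itu", "di", "ke", "dengan"}

def malay_word_ratio(sentence):
    words = sentence.lower().split()
    matched = sum(1 for w in words if w in MALAY_LEXICON)
    return matched / len(words) if words else 0.0

def is_malay(sentence, threshold=0.8):
    return malay_word_ratio(sentence) >= threshold

print(is_malay("saya dan yang ini itu"))   # all five words matched
print(is_malay("the quick brown fox di"))  # only one of five matched
```

In the paper this ratio would feed the binary sentence representation given to the BPNN rather than acting as the final classifier.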

Title of the Paper: Fast Information Retrieval from Web Pages


Authors: Hazem M. El-Bakry, Nikos Mastorakis

Abstract: In this paper, a new fast algorithm for information retrieval is presented. The algorithm relies on performing cross correlation in the frequency domain between the input data and the input weights of fast neural networks (FNNs). It is proved mathematically and practically that the number of computation steps required by the presented FNNs is less than that needed by conventional neural networks (CNNs). The main objective of Internet users is to find the required information with high efficiency and effectiveness. Finding information based on an object's visual features is useful when specific keywords for the object are not known. Since intelligent mobile agent technology is expected to be a promising technology for information retrieval, a number of intelligent mobile agent-based information retrieval approaches have been proposed in recent years. Here, the work presented in [25] for image-based information retrieval using mobile agents is greatly enhanced. Multiple information agents continuously traverse the Internet and collect images that are subsequently indexed based on image information such as URL location, size, type and date of indexation. In the search phase, the intelligent mobile agent receives the image of an object as a query and searches the set of web pages that contain information about the object. This is done by matching the query to images on web pages faster than in the work presented in [25]. Furthermore, by applying cross correlation, object detection becomes position independent. Moreover, by using neural networks, the object can be detected even with rotation, scaling, noise, distortion or deformation in shape.

Keywords: Fast information retrieval, Content-based image retrieval, Image clustering, Intelligent mobile agent
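
The core speed-up the abstract describes, cross correlation computed in the frequency domain, can be checked on a toy signal: the transform-domain product reproduces the direct sliding dot product. The hand-rolled DFT below keeps the example self-contained and does not reflect the paper's FNN implementation:

```python
import cmath

# Circular cross-correlation via the frequency domain:
# IDFT(DFT(x) * conj(DFT(w))) equals the direct sliding dot product,
# so one transform replaces many per-shift dot products.

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def xcorr_freq(x, w):
    X, W = dft(x), dft(w)
    return [c.real for c in idft([a * b.conjugate() for a, b in zip(X, W)])]

def xcorr_direct(x, w):
    n = len(x)
    return [sum(x[(t + s) % n] * w[t] for t in range(n)) for s in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
w = [0.0, 1.0, 0.0, 0.0]
print([round(v, 6) for v in xcorr_freq(x, w)])
print(xcorr_direct(x, w))
```

With an FFT in place of the naive DFT, the frequency-domain route drops the cost from O(n^2) to O(n log n), which is the source of the claimed step-count reduction.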

Title of the Paper: Integration Agent Negotiation and Data Global Consistency forms Automatic and None Bullwhip Effect Supply Chain


Authors: Ching-Shun Hsieh, Jui-Wen Hung

Abstract: Supply chain management (SCM) aims to efficiently integrate suppliers, manufacturers, warehouses, and retailers, not merely to ensure that merchandise is produced and distributed in the appropriate quantities, to the right locations, and at the right time, but also to minimize system-wide costs while satisfying customer requirements. With the trend toward global, transnational enterprise layouts, the study of data consistency convergence is key to improving competitiveness and to addressing the Bullwhip Effect problem in global SCM (GSCM). Moreover, data consistency issues may induce different dependency relations in different extranet applications and partial dependencies on several distributed original data sources (ODS). In this paper, we consider this problem and solve it using a service-routable consistency framework, the Distributed Heterogeneous Web Service Routing based Portal (DHWSRP). The model is based on a scalable routing service algorithm that dynamically reconfigures the data forwarding path within hierarchical enterprise region portals. Through ripple-propagated updating, SCM copies residing in heterogeneous enterprise databases with specified ODS partial dependency relationships can be updated automatically. To integrate SCM partners efficiently, we propose a Dynamic Information Exchange Center (DIEC) for creating a dynamic supply chain network in an Internet environment. Agent technology supports users in negotiating with upstream suppliers or downstream demanders and in making decisions regarding partner selection. The framework allows enterprises to find more opportunities to cooperate with other partners. In business, such a framework not only reduces purchase costs and saves time for enterprises in reaching agreement but also eliminates the Bullwhip Effect problem in GSCM.

Keywords: SCM, Intelligent agent, Negotiation, Outranking methods, Bullwhip Effect, Global consistency, Ripple service routing

Title of the Paper: Dynamic Node Join Algorithm with Rate-Distortion for P2P Live Multipath Networks


Authors: Shyh-Chang Liu, Tsung-Hung Chen, Tsang-Hung Wu, Peter P. S. Wang

Abstract: The delivery of multimedia that efficiently maximizes its quality under changing network conditions is one of the most challenging tasks in the design of live streaming systems. This study attempts to improve current P2P (peer-to-peer) live streaming systems by allowing users to enjoy high-quality service under the limitations of network resources. The proposed improvement method involves summarizing and analyzing the consideration factors and restriction factors that affect live stream quality during system operation. The proposed R-D (Rate-Distortion) optimized dynamic node join algorithm is based on the multipath streaming concept and a receiver-driven approach. This distributed algorithm enables the system to evaluate the current network status in order to optimize the end-to-end distortion of P2P networks. The results of this study demonstrate the effectiveness of the proposed approach.

Keywords: P2P live streaming, Rate-Distortion, multipath streaming, receiver-driven, end-to-end distortion

Title of the Paper: Theory of Multivalent Delta-Fuzzy Measures and its Application


Authors: Hsiang-Chuan Liu, Der-Bang Wu, Yu-Du Jheng, Tian-Wei Sheu

Abstract: The well-known fuzzy measures, the Lambda-measure and the P-measure, each have only one formulaic solution: the former is not in closed form, and the latter is not sensitive enough. In this paper, a novel fuzzy measure, called the Delta-measure, is proposed. This new measure proves to be a multivalent fuzzy measure with infinitely many closed-form solutions, and it can be considered an extension of the above two measures. In other words, the above two fuzzy measures can be treated as special cases of the Delta-measure. To evaluate the Choquet integral regression models with the proposed fuzzy measure against other ones, a real-data experiment using 5-fold cross-validation mean square error (MSE) is conducted. The performances of Choquet integral regression models with fuzzy measures based on the Delta-measure, Lambda-measure and P-measure, respectively, a ridge regression model and a multiple linear regression model are compared. The experimental results show that the Choquet integral regression model with respect to the Delta-measure based on Gamma-support outperforms the other forecasting models.

Keywords: Lambda-measure, P-measure, Delta-measure, Gamma-support, Choquet integral regression model
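
The Choquet integral underlying the compared regression models can be sketched for a finite attribute set. The attribute names, weights and scores below are invented; the additive toy measure is chosen so the result can be checked against an ordinary weighted sum (the paper's measures are generally non-additive):

```python
# Minimal sketch of the Choquet integral over a finite set with respect to
# a fuzzy measure, supplied here as a plain dict over subsets. For an
# additive measure the Choquet integral reduces to a weighted sum, which
# makes the toy example easy to verify by hand.

def choquet_integral(values, measure):
    """values: dict attribute -> score; measure: dict frozenset -> weight."""
    items = sorted(values.items(), key=lambda kv: kv[1])  # ascending scores
    total, prev = 0.0, 0.0
    for i, (attr, score) in enumerate(items):
        level_set = frozenset(a for a, _ in items[i:])    # attrs scoring >= current
        total += (score - prev) * measure[level_set]
        prev = score
    return total

weights = {"math": 0.5, "physics": 0.3, "english": 0.2}
measure = {frozenset(s): sum(weights[a] for a in s)       # additive toy measure
           for s in [("math",), ("physics",), ("english",),
                     ("math", "physics"), ("math", "english"),
                     ("physics", "english"), ("math", "physics", "english")]}
scores = {"math": 90.0, "physics": 80.0, "english": 70.0}
print(choquet_integral(scores, measure))
```

Here the result equals 0.5*90 + 0.3*80 + 0.2*70; a non-additive measure such as the Delta-measure would weight attribute interactions differently.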

Title of the Paper: Replica Technique for Geometric Modelling


Authors: Hameed Ullah Khan

Abstract: Computer graphics incorporates almost everything represented on computers, with the exception of text and sound. A graphic approach based on a replica technique is developed for the construction of figures. In this context, an intelligent algorithm is developed to construct geometric outlines. The algorithm is capable of performing other analysing functions such as filtering, prediction, recognition and translation.

Keywords: Computer Graphics, Computational Geometry, Digital Graphics

Issue 7, Volume 6, July 2009

Title of the Paper: Research on the Conceptualization model for Traceability System of Meat Food Quality Safety


Authors: Zhang Xiaoshuan, Zhang Jian, Zhang Hu, Mu Weisong

Abstract: The quality safety traceability of meat food plays an important role in government regulation, food industry management strategies, and consumer protection. Many institutions and software companies have developed traceability systems, and there is a growing demand for reusability, multi-source data fusion and information sharing among these different systems. The conceptual model is the key basis for system reusability. This paper proposes a conceptual model of a traceability system that integrates Petri nets and FMECA with fuzzy probability and the theory of evidence. The traceable resource unit (TRU) is classified and the information flow is optimized. The results show that the model can improve the reusability and sharing of traceability systems and consequently improve the efficiency and decision-making quality of fast positioning and forward early warning in meat food quality safety traceability.

Keywords: Meat food, quality safety, traceability, Petri Network, FMECA, information fusion

Title of the Paper: Modeling Method of Traceability System based on Information Flow in Meat Food Supply Chain


Authors: Zhang Hu, Zhang Jian, Shen Ping, Zhang Xiaoshuan, Mu Weisong

Abstract: Over the last decades, food safety issues such as the BSE crisis, the illegal spread of genetically modified organisms (GMO) and the melamine-contamination event have become more and more important. The food industry established traceability systems to reduce the impact of food safety issues, and such systems have become an effective method of guaranteeing food safety. The purpose of this paper is to show that a traceability system can not only trace process information but also reduce the number of batches recalled. We study the traceable information flow and risk transmission throughout the food supply chain, covering raw material, processing and distribution. A mathematical model based on dynamic programming was proposed to solve the risk transmission problem in a Chinese dumpling factory, and Radio Frequency Identification (RFID) is used to identify and transfer traceable information in this study. The results show that the factory can plan the production schedule effectively according to the mathematical model to decrease information redundancy, and that the information flow model is the foundation of the traceability system. Applying an RFID system can enhance the ability to gather and transmit information.

Keywords: Traceability, information flow, mathematical model, dynamic planning, food safety

Title of the Paper: A Study of Two Phases Heat Transport Capacity in a Micro Heat Pipe


Authors: Cheng-Hsing Hsu, Kuang-Yuan Kung, Shu-Yu Hu, Ching-Chuan Chang

Abstract: The present study modifies Cotter's model by using the dimensionless liquid flow shape factor, K_1, to predict the maximum heat transport capacity and to discuss the effects of the contact angle. The results indicate that as the dimensionless liquid flow shape factor K_1 decreases, the friction effect on the vapor-liquid interface flow, L_v, increases, and the influence of the vapor flow on the liquid flow also increases. The predicted maximum heat transport capacity agrees well with Babin's experimental data for a copper-water micro heat pipe under L_v = 1 and contact angle α = 10°. For a micro heat pipe, the results indicate that both the maximum heat transport capacity and K_1 increase with an increasing contact angle α.

Keywords: Cotter’s model, Contact angle, Heat pipe, Heat transport capacity, Shape factor, Dimensionless

Title of the Paper: A Study on PID Control with Indirect Liquid/Steam Heating


Authors: Cheng Hsing Hsu, Kuang-Yuan Kung, Shu-Yu Hu, Gia-Chaun Kuo

Abstract: The purpose of this experimental study is to achieve a homogeneous temperature control effect on a heat plate. In this research, the heat plate is kept at constant temperature by a dual-phase state, i.e., the state of phase change between liquid and steam, so that with only a single-point PID temperature controller the homogeneous temperature condition can easily be achieved with a small range of variation. The liquid-vapor region yields a stable, predictable temperature for a specific pressure and a specific working fluid. The experimental results show that, as far as temperature homogeneity is concerned, the best temperature variation obtained is within ±0.5, which is superior to other studies. System identification recovers a model whose transfer function fits the actual system with up to 96.46% accuracy.

Keywords: PID control, Homogeneous temperature, System identification, Phase equilibrium, Temperature control, Semi-conductor

Title of the Paper: A Knowledge Community Website Mode Based Analysis System on Web Game


Authors: Huay Chang

Abstract: Owing to the maturity of information technology and the widespread use of the internet today, many teenagers and students spend much of their time online, and a high proportion of these users spend much of that time playing games. Many such students' grades lag far behind at school, and many of their parents and teachers object to their children's involvement in network games. In this paper the author presents an 'Apply Knowledge Community Website mode based analysis System on Web Game', which embeds history learning in one type of internet game: the web game. The knowledge management mode is employed in our research, and the content is centered on Chinese history. After numerous simulations, we find that a large proportion of network game users are students, and that topics such as fighting and war do not help students increase knowledge creation. Therefore, to help students gain real benefit from playing network games, our knowledge community website mode based analysis system on web games plays an important role in the game world.

Keywords: Knowledge, game, web game, China history, community website mode based analysis system

Title of the Paper: Design and Implementation of Soil Spatial Variation Analysis System


Authors: Gao Lingling, Zhang Rongqun, Zhao Ming, Yuan Hui, Cai Simin

Abstract: The advantage of a traditional integrated GIS (Geographic Information System), such as ArcGIS or MapGIS, is that it integrates every GIS component into an independent, complete system; but such systems are complex, huge and difficult to install, which leads to high cost and difficulty in integrating with other applications or systems. A component GIS that meets various professional needs not only allows free and nimble recombination of its modules, but also has a visual interface and easy-to-use standard interfaces. Research on spatial variation mainly uses traditional statistical analysis and geostatistics. The advantage of geostatistics is that it considers not only the sample values but also the spatial positions of the samples and the distances between them, which makes up for the defect of traditional statistical analysis, namely its neglect of spatial position. That geostatistics has not been integrated with GIS is a severe flaw of GIS. We therefore developed a light-weight Soil Spatial Variation Analysis System using VS 2005 on top of ArcGIS Engine, combined with present needs. The article introduces the system's design ideas, its major function modules, and its development and implementation, and uses soil data from Handan County, Hebei Province, for system testing. The results show that the system has good operational efficiency and accuracy and meets general soil spatial variation analysis needs.

Keywords: Spatial gridding methods; Geostatistics; Soil Spatial variation analysis; Components development; ArcGIS Engine

Title of the Paper: Spatio-Temporal Location Simulation of Wetlands Evolution of Yinchuan City Based on Markov-CA Model


Authors: Zhang Rongqun, Zhai Huiqing, Tang Chengjie, Ma Suhua

Abstract: Geographical Information Systems (GIS) are still mainly concerned with describing and processing static spatial data; it is difficult for them to express dynamic data effectively, and they cannot achieve spatio-temporal analysis of geographical processes. Classic prediction models of geographic processes take larger geographic or administrative units as the research object, neither applying high-resolution spatial information nor achieving visual expression of simulation results. The expression of time, space and state in GIS is discrete. Cellular Automata (CA) are grid dynamic models whose spatial interactions and temporal cause-effect relationships are local; their expression of time, space and state is also discrete. CA and GIS can therefore be seamlessly combined so that the two complement each other in spatio-temporal modeling. A Markov model combined with a Cellular Automata model, with the support of GIS, can match the results of the classic prediction model with spatial information at the micro scale, and achieve integrated simulation of time, property and space. In this paper the Landsat images of 1991, 1999 and 2006 are used as the basic information source. After establishing the classification system and interpretation signs, we obtain wetland landscape distribution maps of the Yinchuan Plain for the three periods. The wetland landscape distribution maps of 1991 and 1999 are used to set up the initial state matrix and the transition probability matrix of the wetland landscape types of the Yinchuan Plain. The approach to establishing the Markov-CA model is proposed, and the transition rules and the technical system for establishing these rules are explained in detail. Finally, using the established simulation system, the evolution of the wetland distribution of the Yinchuan Plain is simulated, and the wetland distribution map interpreted from the TM image of 2006 is treated as ground truth to complete the accuracy analysis.
The study results show that the simulation model established by this method is able to meet the requirements of wetland evolution location simulation.

Keywords: Wetlands simulation, Spatio-Temporal location simulation, Markov-CA, transition probability, transition rules, Yinchuan city
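
The Markov half of the Markov-CA model reduces to projecting a landscape state vector with a transition probability matrix; a sketch with an invented three-type matrix (not the one estimated from the 1991/1999 maps):

```python
# One Markov step: a wetland landscape state vector (area shares by type)
# projected forward with a transition probability matrix. The matrix below
# is illustrative only; the CA part, which spatializes these transitions
# cell by cell, is not shown.

def markov_step(state, matrix):
    """state[i]: share of type i; matrix[i][j]: P(type i -> type j)."""
    n = len(state)
    return [sum(state[i] * matrix[i][j] for i in range(n)) for j in range(n)]

types = ["lake", "paddy", "non-wetland"]
state_1991 = [0.30, 0.50, 0.20]
transition = [
    [0.80, 0.05, 0.15],   # lake -> lake / paddy / non-wetland
    [0.10, 0.70, 0.20],   # paddy -> ...
    [0.02, 0.03, 0.95],   # non-wetland -> ...
]
state_1999 = markov_step(state_1991, transition)
print([round(s, 4) for s in state_1999])
```

Iterating the step projects further periods; since each matrix row sums to one, the projected shares always remain a valid distribution.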


Title of the Paper: Band Selection of Hyperspectral Images Based on Bhattacharyya Distance


Authors: Cai Simin, Zhang Rongqun, Cheng Wenling, Yuan Hui

Abstract: With the development of sensor technology, the spectral resolution of remote sensing images has continuously improved. The appearance of hyperspectral remote sensing is a tremendous leap in the field of remote sensing. The increasing availability of hyperspectral data and imagery has enriched us with better and finer data and gives us a much stronger ability to identify features. However, approaches to feature identification in hyperspectral images have not been as successful as expected. Too many bands and a large amount of data not only cause difficulties in data storage and transmission, but also pose new challenges for hyperspectral image processing technology, especially hyperspectral feature recognition. Band selection aims to support effective feature recognition: features should be distinguished by their spectral curve properties, since these curves carry important information for recognizing different land cover types. It therefore makes sense to choose the best combination from the many bands and form a new hyperspectral image space, a procedure usually called feature selection. The Bhattacharyya distance is one of the commonly used methods. It is a statistical distance which can reasonably measure the distance between different land types in a high-dimensional space. The hyperspectral data used in this paper were obtained by the OMIS (Operational Modular Imaging Spectrometer) sensor. We propose a band selection method based on the Bhattacharyya distance, in which we try to find the optimal band combination. We divide the land types in the research area into five classes (seawater, fishery, building, vegetation and crops) and calculate the Bhattacharyya distance between the class pairs. According to the optimal band subset selected by the Bhattacharyya distance, we perform a classification and evaluate the classification accuracy.
Experimental results show that the proposed band selection method compares favorably with conventional methods.

Keywords: Hyperspectral images, Band selection, Bhattacharyya distance, Coincident spectral plot
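
For two Gaussian class models on a single band, the Bhattacharyya distance has a closed form, and ranking bands by it is the selection idea the abstract describes. The per-band class statistics below are hypothetical:

```python
import math

# One-dimensional Bhattacharyya distance between two Gaussian class models,
# the separability score maximized when picking band subsets. Multi-band
# selection would use the multivariate form; this univariate sketch shows
# the ranking idea on single bands.

def bhattacharyya(mu1, var1, mu2, var2):
    term_mean = 0.25 * (mu1 - mu2) ** 2 / (var1 + var2)
    term_var = 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2)))
    return term_mean + term_var

# Hypothetical per-band (mean, variance) statistics for two land classes:
bands = {
    "band_3": ((0.10, 0.01), (0.12, 0.01)),   # classes overlap -> low distance
    "band_4": ((0.05, 0.01), (0.45, 0.02)),   # classes separate -> high distance
}
ranked = sorted(bands, key=lambda b: bhattacharyya(*bands[b][0], *bands[b][1]),
                reverse=True)
print(ranked)  # most discriminative band first
```

With five classes, the per-band score would aggregate the distances over all class pairs before ranking.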

Title of the Paper: A Service Platform Design for Affective Lighting System based on User Emotions


Authors: Byounghee Son, Youngchoong Park, Hyo-Sik Yang, Hagbae Kim

Abstract: The development of IT has changed not only computer techniques, but also human life and the environment, and it provides us with more and more chances to interact with computers. Because communicating with computers can feel uneasy, research has moved away from device-centered automatic solutions toward intelligent solutions that consider human emotion. Skillfully dealing with emotions is important in realizing more natural communication between computers and human beings, and this has raised the importance of emotional technology. To investigate human emotion accurately, however, a technique is needed that can comprehensively recognize varied and complex user context data, the condition of the user's space, and changes to this space. Additionally, research on sensors that recognize various environments and studies of reasoning algorithms are necessary. In particular, research on standard platforms and storage techniques for recognition, reasoning, expression, and similar areas is necessary for extracting personal attributes such as gender, age, birth area, and race. This paper outlines the service platform design for an affective lighting system that can determine the emotional condition of a user and supply light adequate to it, after adaptively recognizing the user's physiological response as well as the space in which the user is active.

Keywords: Affective lighting system, Intelligent LED (light emitting diode), ISO/IEC 11197, User-centric affective life, Service Platform, Affection, Emotions

Title of the Paper: Visualizing Patterns of Interview Conversations Regarding Students' Learning Difficulties in Statistical Concepts via QMDScaling Techniques


Authors: Zamalia Mahmud, Rosli Abdul Rahim

Abstract: Many attempts have been made by researchers and practitioners to analyze interview conversations. Variations in interview analysis techniques can lead to inconsistencies in the interpretation of the conversations. Through QMDScaling techniques, not only can interview conversations be interpreted precisely and accurately, but patterns in the conversations can also be visualized and interpreted via a joint-dimensional space. One area gaining popularity in research involving both qualitative and quantitative data is statistical education. On that note, the purpose of this paper is to describe the methodological techniques used in visualizing patterns of interview conversations regarding students' learning difficulties in basic statistical concepts as perceived by statistics educators. The process involves reducing and categorizing the interview conversations into themes, transforming the themes into codes, and finally transforming the codes into a binary matrix and then into a joint-dimensional space. These techniques have shown that inconsistencies in the interpretation of interview conversations can be avoided, and they have helped to understand issues and reveal findings which could not be elicited and interpreted precisely through conventional techniques.

Keywords: Interview conversations, learning difficulties, statistical concepts, qualitative matrix, multidimensional scaling
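
The theme-coding step (themes to codes to a binary matrix) can be sketched as follows; the themes and interview responses are invented, and the final embedding itself is left to a standard multidimensional scaling routine:

```python
# Sketch of the coding step: interview themes become a binary
# respondent-by-theme matrix, and pairwise squared Euclidean distances
# from that matrix are what an MDS routine would embed in a joint space.

THEMES = ["notation", "probability", "interpretation"]

def to_binary_row(mentioned):
    return [1 if t in mentioned else 0 for t in THEMES]

def sq_distance(row_a, row_b):
    return sum((a - b) ** 2 for a, b in zip(row_a, row_b))

interviews = {
    "educator_1": {"notation", "probability"},
    "educator_2": {"probability"},
    "educator_3": {"interpretation"},
}
matrix = {name: to_binary_row(m) for name, m in interviews.items()}
print(matrix["educator_1"])
print(sq_distance(matrix["educator_1"], matrix["educator_2"]))
print(sq_distance(matrix["educator_1"], matrix["educator_3"]))
```

Respondents sharing themes end up close in the distance matrix, which is why they cluster together in the joint-dimensional space.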

Title of the Paper: Ontology-based Intelligent Retrieval System for Soil Knowledge


Authors: Zhao Ming, Zhao Qingling, Tian Dong, Qian Ping, Zhang Xiaoshuan

Abstract: With the development and popularization of the Internet, research has focused on how to retrieve required information quickly and exactly from a large amount of information. Ontologies provide a new intelligent Web-based search method. In this paper, following ontology theory for agriculture and combining it with the discipline of soil and agricultural chemistry, a retrieval system was built that takes the soil knowledge system as an example and uses the native XML (eXtensible Markup Language) database Tamino as the information navigation database. According to the demands input by users, the system displays related information as a tree and understands users' demands through their clicks, primarily realizing intelligent Web searching. This article also introduces the design and implementation of the intelligent retrieval system and the XML and JSP (Java Server Pages) technology in detail. The system can be extended to the retrieval of other shared information resources, providing efficient and relevant services for users.

Keywords: Ontology, soil Knowledge system, intelligent retrieval

Title of the Paper: Learning Difficulties Diagnosis for Children's Basic Education using Expert Systems


Authors: Jose Hernandez, Gloria Mousalli, Francklin Rivas

Abstract: SEDA (Expert System for Learning Difficulties, or 'Sistema Experto de Dificultades para el Aprendizaje' in Spanish) is a software system designed using Expert Systems design methodologies. It contains a knowledge base comprising a series of strategies for psychopedagogical evaluation, as well as tools that allow the teacher to assess psychofunctions and basic skills for learning. In the vast and complex world of educational work, the importance of special education in all its dimensions is highlighted every day; over time, learning problems have become better understood, as has the inescapable responsibility of each specialist to make accurate diagnoses and take prompt remedial action. Psychopedagogical evaluation for diagnosis is the focus of this expert system, in response to the concern of many career teachers who, for various reasons, report difficulty in preparing assertive diagnoses describing the learning difficulties of their students.

Keywords: Expert Systems, learning difficulties, artificial Intelligence

Title of the Paper: Application of Alternating Group Explicit Method For Parabolic Equations


Authors: Qinghua Feng

Abstract: Based on the concept of decomposition, two alternating group explicit methods are constructed, one for the 1D convection-diffusion equation with variable coefficient and one for 2D diffusion equations. Both methods have the properties of unconditional stability and intrinsic parallelism. Numerical results show that the two methods are of high accuracy.

Keywords: Alternating group method, parallel computing, explicit scheme, parabolic equation, finite difference
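
For contrast with the unconditionally stable schemes the abstract constructs, a plain explicit (FTCS) scheme for the 1D diffusion equation shows the stability restriction that alternating group explicit methods are designed to remove; this baseline is not the paper's AGE scheme:

```python
# Forward-time central-space (FTCS) explicit scheme for u_t = a * u_xx
# with fixed (Dirichlet) ends. It is stable only when r = a*dt/dx**2 <= 1/2,
# the restriction an alternating group explicit construction lifts while
# remaining explicit and parallel.

def ftcs_step(u, r):
    new = u[:]                                   # endpoints stay fixed
    for i in range(1, len(u) - 1):
        new[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

u = [0.0, 0.0, 1.0, 0.0, 0.0]                    # initial heat spike
r = 0.25                                         # within the stability bound
for _ in range(10):
    u = ftcs_step(u, r)
print([round(v, 4) for v in u])                  # the spike diffuses outward
```

Running the same loop with r above 0.5 makes the solution oscillate and blow up, which is exactly the constraint that motivates the alternating group construction.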

Title of the Paper: Face Recognition as an Airport and Seaport Security Tool


Authors: Jyri Rajamaki, Tuomas Turunen, Aki Harju, Miia Heikkila, Maarit Hilakivi, Sami Rusanen

Abstract: The transportation industries have been subjected to unprecedented scrutiny and regulatory mandates in the post-9/11 era. On the other hand, inner border inspections were closed down in Europe with the Schengen agreement. Freedom of movement has brought new challenges to the authorities and transportation companies. Effective camera surveillance with a facial recognition system (FRS) could be a realistic solution. FRS requires one or more cameras and a control device: a computer with special software. The software processes the material, i.e. face images, collected by the cameras. FRS has been used as a monitoring and controlling tool at major events and border crossings. The aim of FRS is to maintain and improve safety and security in a cost-efficient way by saving manpower. However, FRS is an additional security tool and therefore should not be relied upon alone. FRS is mainly used as a verification method in which the human face functions as an access or PIN code. The optimal operational environment for FRS is dry with stable illumination; an indoor environment is most likely needed to guarantee operational ability. Images of the faces should be collected at close distance, and the persons to be identified should cooperate. FRS is a composition of technical elements and applications that are commonly used in everyday life. By profiling the environment and setting reasonable aims, FRS could be used in various places; hence FRS is challenging the traditional methods as a sophisticated security tool for sophisticated situations. So far, the only operational FRS in Finland started in summer 2008 at Helsinki-Vantaa airport. This paper examines and collects experiences from the airport pilot project, from the literature and from interviews with experts in the security and facial recognition fields. The aim of the paper is to specify the desired goal state: how FRS could be applied as a new seaport and maritime security tool.

Keywords: Camera surveillance, Crime prevention, Face recognition, Facial recognition system, Maritime security, Port security

Title of the Paper: A Forwarding Station Integrated the Non-Confirmed Routing Protocol in Ad-hoc Wireless Sensor Networks


Authors: Ching-Mu Chen, Tung-Jung Chan, Tsair-Rong Chen

Abstract: An ad-hoc wireless sensor network organizes itself as a network in which many sensor nodes in a certain area automatically communicate with each other. Each sensor node consists of a transmitting unit, a receiving unit, a central processing unit, and a battery unit. Because a sensor node's battery may not be replaceable or rechargeable, it is important that both the base station and the sensor nodes communicate well, so that the sensor nodes consume less energy and the lifetime of the entire ad-hoc wireless sensor network is extended. In this paper, the ad-hoc wireless sensor network is divided into many clusters, and every cluster contains exactly one cluster head. All wireless sensor nodes transmit their messages to the cluster head to which they belong, and the cluster head then returns a confirmation message to every sensor. However, retransmitting a confirmation message from the cluster head to the sensor node dissipates much energy. Since the base station is far away from the sensed area, a forwarding station is needed to forward messages from the sensed area to the base station. This paper therefore proposes a forwarding station integrated with the non-confirmed routing protocol so that the network lifetime can be extended. Simulation results show that the network lifetime is extended well and the performance is much better.

Keywords: Energy consumption, Network lifetime, Ad-hoc wireless sensor networks, Confirmed message, Forwarding station, Routing protocol

Title of the Paper: Qualitative and Quantitative Analysis of Workflows Based on the UML Activity Diagram and Petri Net


Authors: Kwan Hee Han, Seock Kyu Yoo, Bohyun Kim

Abstract: Since business workflows are closely related to enterprise performance, successful execution of workflows is a critical driving force for the strategic advantage of an enterprise. It is therefore essential to customer satisfaction and productivity enhancement that structural errors in workflow instances be detected and their performance evaluated before enactment. For structural verification and performance evaluation of workflows, this paper integrates the strengths of the UML activity diagram and the Petri net: it proposes a mapping scheme from the UML activity diagram to the Petri net, verifies the structural errors of a workflow using reachability tree analysis, and finally predicts workflow performance before execution using business process simulation. Through the proposed workflow analysis procedure, workflow modelers in enterprises can analyze the qualitative and quantitative aspects of a workflow in an integrated way.
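As an illustrative sketch of the reachability analysis mentioned above — the workflow and its Petri-net encoding here are hypothetical, not the paper's mapping scheme — a breadth-first construction of the reachability set might look like:

```python
from collections import deque

# Hypothetical workflow encoded as a Petri net: each transition is a
# (consume, produce) pair of place->token-count dicts.
def enabled(marking, consume):
    return all(marking.get(p, 0) >= n for p, n in consume.items())

def fire(marking, consume, produce):
    m = dict(marking)
    for p, n in consume.items():
        m[p] = m.get(p, 0) - n
    for p, n in produce.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable_markings(initial, transitions):
    """Breadth-first construction of the reachability set (the node set
    of the reachability tree, merging duplicate markings)."""
    def key(m):  # canonical form ignoring zero-token places
        return frozenset((p, n) for p, n in m.items() if n)
    seen = {key(initial)}
    queue = deque([initial])
    found = [initial]
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            if enabled(m, consume):
                m2 = fire(m, consume, produce)
                if key(m2) not in seen:
                    seen.add(key(m2))
                    queue.append(m2)
                    found.append(m2)
    return found
```

A reachable marking in which no transition is enabled and the end place holds no token would signal a structural error such as a deadlock.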

Keywords: Workflow, UML activity diagram, Petri Net, reachability tree, structural verification, performance evaluation

Issue 8, Volume 6, August 2009

Title of the Paper: Identification of Learners' Attitudes Toward Statistics Based on Classification of Discriminant Function


Authors: Zamalia Mahmud

Abstract: This study identified the profiles of learners' attitudes toward statistics through the classification process of a discriminant function. This multivariate technique profiles a subject's attitude toward statistics as either positive or negative. The study characterized each learner profile by relating it to perceived attitudes toward statistics, type of learner, mode of study, programme structure, age, gender and the learner's evaluation of the statistics course. Learners' attitudes toward statistics were measured using the Attitudes Toward Statistics (ATS) instrument, which comprises four sub-scales or dimensions, namely Affect, Cognitive Competence, Value and Difficulty. These variables are examined as predictors that discriminate learners with positive attitudes from those with negative attitudes toward statistics. The results indicate that learners with positive attitudes can be reliably distinguished from learners with negative attitudes across the four ATS sub-scales, type of learner, mode of study and the learner's evaluation of the course. The results should assist instructors in fine-tuning their teaching methodologies to optimize the teaching and learning of statistics in the classroom.
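For readers unfamiliar with discriminant classification, a minimal univariate sketch shows the idea: with equal group variances and equal priors, the linear discriminant boundary reduces to the midpoint of the two group means. The attitude scores below are made up for illustration, not data from the study:

```python
# Minimal univariate sketch of discriminant classification (hypothetical
# composite attitude scores; assumes equal group variances and priors).
def mean(xs):
    return sum(xs) / len(xs)

def fit_cutoff(positive_scores, negative_scores):
    """Decision boundary between the two attitude groups: the midpoint
    of the group means under the equal-variance assumption."""
    return (mean(positive_scores) + mean(negative_scores)) / 2

def classify(score, cutoff):
    """Assign the profile on `score`'s side of the cutoff (assuming the
    positive group has the higher mean)."""
    return "positive" if score >= cutoff else "negative"
```

The full analysis in the paper works in several predictor dimensions at once, but the decision rule has the same flavour: a linear score compared against a fitted boundary.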

Keywords: Discriminant function, statistics learners, attitudes toward statistics, profiles

Title of the Paper: Grey Group Model Forecasting of Quarterly Accounting Earnings


Authors: Zheng-Lin Chen, Chan-Chien Chiu, Chia-Jui Lai

Abstract: Forecasting quarterly accounting earnings is an important task in prospective analysis, and seasonality is an important phenomenon in the behavior of such data. In this paper, we investigate the efficiency of applying the grey group model to forecast earnings per share. Unlike traditional statistical models such as the Foster model, the grey group model is not only easy to calculate but also requires fewer observations for model building. Furthermore, the model frees users from the sample-data assumptions of statistical analysis, such as independent and identical distribution. A sample of fifty firms trading on the Taiwan Stock Exchange is employed, and the forecasting performance is compared with that obtained by the Foster model. The results demonstrate that the grey group model is competitive and competent in prospective analysis.
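A plain single-series GM(1,1) model — the building block of grey modelling, though not the paper's grouped variant — can be sketched as follows; the earnings figures used in any example would be invented for illustration:

```python
import math

def gm11_forecast(x0, steps):
    """Plain GM(1,1) grey forecasting: fit on the series x0 and return
    `steps`-ahead forecasts. This is the single-series building block,
    not the paper's grouped variant."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    # Least squares for x0[k] = -a*z1[k-1] + b (2-parameter normal equations).
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z1[k - 1] * x0[k] for k in range(1, n))
    sy = sum(x0[1:])
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    def x1_hat(k):  # time response of the accumulated series
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    # Inverse accumulation turns x1 forecasts back into x0 forecasts.
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]
```

Note how little data the fit needs — four observations suffice — which is exactly the advantage over classical time-series models claimed in the abstract.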

Keywords: Quarterly Accounting Earnings, Forecasting, Grey Group Model

Title of the Paper: WEB based Technology in Planning Sports Education in Primary Schools


Authors: Danimir Mandic, Dragan Martinovic, Dragoljub Visnjic

Abstract: This paper deals with modern technologies that can be used for the realization of sports education in primary schools. The physical education syllabus is organized around thematic areas, one of which is so-called sports and technique education. This programme framework is defined by its own educational contents and is based on requirements concerning motor skills at different levels of difficulty. During physical education, students gain certain motor skills which are relevant to their everyday life, work, or sports and recreation training. This paper deals with a database management system which we created to measure, calculate and define the optimal exercises for sports and technique education in primary schools.

Keywords: WEB portal, Informational technology, Physical education, Syllabus, Evaluation, Teaching

Title of the Paper: Design and Implementation of Photogrammetry Based Product Re-Realization


Authors: Jiacai Wang, Ichiro Hagiwara

Abstract: This study presents an experimental reverse engineering procedure for rapid modeling and manufacturing of products with complex freeform surfaces. Its ultimate purpose is to explore a fast, efficient, economical and practical route to the reproduction of existing physical objects with a digital camera and CNC machine tools for small and medium-sized manufacturers in a P2P-based multimedia collaborative environment. The paper focuses on developing a prototype of photogrammetry-based 3D digitization, B-spline surface model reconstruction, and tool path generation based on the Open CASCADE kernel. Experiments show that the method is applicable to reverse modeling of freeform objects with moderate accuracy requirements. The presented schema is therefore a valid alternative to laser scanning techniques and coded-light-based triangulation approaches, with an important reduction in hardware cost and time.

Keywords: Reverse engineering, Photogrammetry, Point cloud, B-Spline, Tool path generation

Title of the Paper: A Graphical Method of Detecting Pneumonia Using Chest Radiograph


Authors: Norliza Mohd. Noor, Omar Mohd. Rijal, S. A. R. Abu-Bakar, Mohd. Iqbal, Gan Chew Peng

Abstract: An important ingredient of health care is the correct initial diagnosis of chest ailments using chest radiograph images. This paper develops a simple graphical method to aid the initial screening and discrimination of pneumonia (PNEU) patients from pulmonary tuberculosis (PTB) patients, lung cancer (LC) patients and normal healthy individuals (NL). Approximate confidence regions, built by principal component methods on selected texture measures, detect and discriminate PNEU from PTB, LC and NL. A brief simulation study indicates that the probability ellipsoid is robust to mild deviations from normality in the presence of two outliers. The main result of this study is that the PNEU ellipsoid, when applied to test data, is capable of detecting pneumonia in the sense that 100% of NL, 85% of LC and 65% of PTB cases are rejected when using contrast texture measures; the combination of twelve features gives similar results. Membership of the PNEU ellipsoid obtained from the chest radiograph image can serve as a useful first-stage detection of pneumonia.

Keywords: Image manipulation and recognition, medicine, texture measures, statistical methods

Title of the Paper: Measurement of Service Effectiveness and Establishment of Baselines


Authors: Dzenana Donko, Ismet Traljic

Abstract: In this paper, the concept of measuring service effectiveness is described, focusing on customer-defined quality, continual improvement, employee empowerment, and measurement-based management and feedback. Metrics must be developed based on the priorities of the strategic plan, which provides the key business drivers and criteria for metrics. We define these metrics and the information relevant to their evaluation, and establish measurement criteria. The most difficult part of implementing this approach is consolidation. The described framework captures the relationship between business and IT services and identifies quantitative techniques for establishing baselines and discovering possible losses during process deployment.

Keywords: Service management lifecycle, service operation, balanced scorecard, strategic objectives

Title of the Paper: Web-based Multimedia for Distance Learning


Authors: Rong-Jyue Fang, Hua-Lin Tsai, Chi-Jen Lee, Yi-Hsing Chang

Abstract: Education is now engaged in a transitional period with respect to new technologies that is unique in its history. No previous technology proposed to "revolutionize" education has had comparably revolutionary consequences. Given the technological resources, dealing with the challenges of their effective utilization in education will focus on the human professional. To apply web-based multimedia to distance education, it is important to understand its characteristics and the educational concerns it raises. This study identifies characteristics of web-based multimedia for distance learning settings.

Keywords: Web-based multimedia, distance learning

Title of the Paper: Using Electromagnetic Radiation (EMR) and Continuous Vertical Electrical Sounding (CVES) to Locate Zones of Weakness for Submarine Groundwater Discharge (SGD)


Authors: Emad Akawwi, Abdallah Al-Zoubi, Maher Kakaish, Abdalrahman Abueladas

Abstract: Two geophysical techniques, electromagnetic radiation (EMR) and vertical electrical sounding, were used in this study. The major objective is to investigate and evaluate the active faults, fractured zones and subsurface sinkholes along the eastern shores of the Dead Sea. The faults and fractures are considered weakness zones for submarine groundwater discharge; by determining these subsurface structures, the features that control submarine groundwater discharge into the Dead Sea can be identified. The first EMR profile started at coordinates 31° 41′ 28.18″ N and 35° 34′ 38.87″ E in the South Sweimah area at the northern part of the Dead Sea. Many fractured zones were found along this profile. A main highly fractured zone was observed between 790 and 1500 m north of the starting point, and another between 880 and 1000 m north of the starting point. A few sinkholes were observed between about 890 m and 1265 m north of the starting point, and the main sinkhole was recognized at a distance of 2718 to 2730 m north of the starting point. The second EMR profile was carried out at coordinates 31° 4362.700 N and 35° 35 33.400 E. Geological features were found at distances of 2455 to 2460 m from the starting point, and many active faults and joints were found along this profile at distance intervals of about 310-315 m, 660-665 m and 695-700 m from the starting point. The main areas of interest were found at distances of 2440-2450 m and 2635-2643 m from the starting point. The main fractured zones were found between 2490 and 2505 m, and from 2330 m to the end of the profile at coordinates 31° 45′ 45.07″ N and 35° 34′ 33.48″ E. The vertical electrical sounding profile shows that the first shallow zone is located at offsets between 0 and 40 m; the second zone is located at offsets between 320 and 390 m, 12 m below the surface. The vertical distortion of resistivity values at a horizontal distance of 240 m may be due to a shallow fault.

Keywords: Electromagnetic Radiation, Fault, Sinkholes, Subsurface, Fractures, Dead Sea, vertical electrical sounding

Title of the Paper: Prediction of Grid-Photovoltaic System Output Using Three-variate ANN Models


Authors: Shahril Irwan Sulaiman, Ismail Musirin, Titik Khawa Abdul Rahman

Abstract: This paper presents the prediction of total AC power output from a grid-connected photovoltaic system using three-variate artificial neural network (ANN) models. Two-hidden-layer feedforward ANN models are considered, configured as three different models based on different sets of three inputs each. The first model uses solar radiation, wind speed and ambient temperature as inputs; the second uses solar radiation, wind speed and module temperature; and the third uses solar radiation, ambient temperature and module temperature. All three models employ the same output, the total AC power produced by the grid-connected system. A data filtering process is introduced to select quality data patterns for the training process, so that only informative features are available; the regression analysis and root mean square error (RMSE) performance of each model could thus be enhanced. After the training process is completed, the testing process is performed to decide whether training should be repeated or stopped. Besides selecting the best prediction model, this study also exhibits experimental results illustrating the effectiveness of data filtering in predicting the total AC power output from a grid-connected system. Each ANN model was tested with the Levenberg-Marquardt and scaled conjugate gradient training algorithms to select the best training algorithm for each model. A fully trained ANN model should later be able to predict the AC power output from a set of unseen data patterns.

Keywords: Artificial Neural Network (ANN), photovoltaic (PV), regression coefficient (R), root mean square (RMSE), prediction, solar radiation (SR), ambient temperature (AT), wind speed (WS), AC power

Title of the Paper: Students' Understanding of Plagiarism and Collusion and Recommendations for Academics


Authors: Zaigham Mahmood

Abstract: Plagiarism and collusion are forms of academic misconduct. In academic institutions, plagiarism is on the increase, and with the ready availability of information through the Internet and essay mills, it is becoming an issue of concern. Although plagiarism and collusion are regarded as academic offences with severe penalties, a large number of students still commit these offences, sometimes with intent. This paper discusses how and why some students plagiarize, collude or collaborate (when collaboration is not allowed). It also details different forms of plagiarism and reports the results of an experiment to probe students' understanding and perceptions: a questionnaire was constructed with a number of simple scenarios, and students were asked to determine whether they were instances of plagiarism, collusion or collaboration. The results show that whereas students think they know what plagiarism is, they cannot always identify it when presented with different scenarios. The paper also mentions a number of plagiarism detection tools and suggests strategies for lecturing staff to adopt to deter students from submitting plagiarized work.

Keywords: Plagiarism, Collusion, Collaboration, Cyber plagiarism, Contract cheating

Title of the Paper: Development of Teaching and Learning Method on Islamic Pedagogy in West Africa


Authors: Maimun Aqsha Lubis, Mohammed Diao, Ramlee Mustapha, Ruhizan Mohd Yasin

Abstract: The influence of Islam on West African education systems has been significant. The transformation from traditional Quranic schools to more modern Islamic education was slow but evident. However, there has been a lack of empirical study on the implementation of Islamic pedagogy in West African countries. Therefore, this study was conducted to examine the development of Islamic teaching and learning in West Africa. Data were collected through questionnaires and observation. The study highlights the role of traditional centers of Islamic learning in West Africa, such as Timbuktu, Gene, Kanem and Bornun. Teachers need to embrace educational methodology and technology and be able to apply ideas from various sources. A minority of the teachers agree with the use of the traditional method, while a majority agree that the traditional method of teaching and learning needs to be developed. Workshops, seminars and discussions are the educational methodologies and approaches that can develop traditional Islamic teaching.

Keywords: Islamic pedagogy, Quranic schools, West Africa, educational technology in Islamic pedagogy, educational development, traditional method, modern approach

Title of the Paper: Educational Technology as a Teaching Aid on Teaching and Learning Of Integrated Islamic Education in Brunei Darussalam


Authors: Maimun Aqsha Lubis

Abstract: This study was carried out to examine the extent of the implementation of the Integrated Islamic Education system and its enhancement of teaching and learning through technology. It is a qualitative study of the implementation of educational technology in the teaching and learning processes of Integrated Islamic Education in Brunei Darussalam. The Pioneer Schools (Sekolah Rintis) and the Thoughtful School (Sekolah Hatiminda) are solid evidence that Brunei has decided to settle on Integrated Islamic Education for its future generations. Analysis of the interviews and questionnaires concluded that educational technology was already being utilized to some extent in the classroom, although the teachers did not realize that they had been using and applying it daily. Teachers need to make the teaching and learning process effective and interesting. The educational technology practiced in primary schools is somewhat limited in the areas of application of teaching methodology and approach, application of media in teaching, and management and administration of the research schools.

Keywords: Integrated Islamic Education, teaching and learning, educational technology, Brunei Darussalam, Islamic school, application of media in teaching and management

Title of the Paper: Teaching and Learning Process with Intergration of ICT - A Study on Smart Schools of Malaysia


Authors: Maimun Aqsha Lubis, Siti Rahayah Ariffin, Mohammed Sani Ibrahim, Tajul Ariffin Muhammad

Abstract: The 2020 mission is basically to develop a world-class quality educational system which will realize the full potential of the individual and fulfil the aspiration of the 'One Malaysian nation'. This will only happen if education departs from the disinterested rote-learning mode and explores how information technology can be used to encourage active, creative and independent learning. Malaysia therefore needs to make the critical transition from an industrial economy to being a leader in the Information Age. To make this vision a reality, the Malaysian Government undertook a strategic project through smart schools. Smart Schools were identified as having a key role in increasing the number of ICT-skilled people to meet the demand of industries that would be integrating ICT into their processes. A fundamental shift is needed towards a more technologically literate and thinking workforce, able to perform in a global work environment and to use the tools and technology available in the Information Age.

Keywords: Smart school, technology in education, ICT-skilled people, Malaysia experience, classroom and technology, learning process

Title of the Paper: Educational Strategy And Technique in Teaching and Learning Islamic Education: Perception of African Teachers


Authors: Maimun Aqsha Lubis, Mohd Isa Hamzah, Mohd Arif Ismail, Nik Mohd Rahimi Nik Yusof

Abstract: Educational strategy and technique in teaching and learning Islamic education in West Africa have a long history. The educational activities and cultural centers of West Africa played a strategic role in introducing Islamic teaching; they also played a significant role in establishing great Islamic rulers and in extending the Islamic world into West Africa. This paper reports research on educational strategy and technique in teaching Islamic education from the perception of African teachers, conducted by survey. Data were collected from questionnaires distributed to 83 teachers from Timbuktu and Gene in Mali, and Kanem and Bornum in Nigeria. The research findings showed that five educational techniques and two educational strategies were effective in teaching and learning Islamic education. In addition, enhancing teaching and learning through technology contributes in two areas: (i) ICT (information and communication technology) may be used as a medium in teaching and learning to develop more creative thinking in the education process, and (ii) media is a form of teaching aid that assists teachers, provides them with tools to illustrate points or processes, and supports long-distance education. The seven strategies and techniques of classroom teaching and learning perceived by the African teachers were effective; African teachers have thus improved their ability to apply more effective strategies and techniques in their teaching. For students, the educational techniques and media are important in enabling effective learning.

Keywords: Educational strategy and technique, Teachers’ perception, Teaching and learning in the classroom, African teachers, Islamic education

Title of the Paper: The Application of Multicultural Education and Applying ICT on Pesantren in South Sulawesi, Indonesia


Authors: Maimun Aqsha Lubis, Mohamed Amin Embi, Melor Md.Yunus, Ismail Suardi Wekke

Abstract: Multicultural education is one of the most popular terms used to describe education for pluralism. One of the best platforms for bringing excellence to education, in turn, is information and communication technology (ICT). Applying ICT in multicultural education can enhance students' capacity and accelerate pedagogy. This research describes various applications and processes in the pesantren that are constructed to reduce discrimination with ICT assistance. In addition, it describes a well-planned curriculum built around concepts from several different ethnic groups. The pesantrens studied are located in South Sulawesi, Indonesia. The research design is a qualitative approach: data were collected through interviews and observational field work, and elaborated through grounded-theory analysis. This study showed that stakeholders in the pesantren work with content integration, with various practices addressing the extent to which teachers use examples and content from a variety of cultures to illustrate key concepts, principles and theories in their subject area or discipline. Teachers conceptualize multicultural education as content related to various ethnic and cultural groups. Furthermore, the learning processes of multicultural education supported by ICT helped students achieve the educational goals and objectives. Finally, ICT extended the teaching and learning processes and knowledge of the subject, and at the same time improved the students' capability to gain an accelerated education.

Keywords: Pesantren, multicultural education, multicultural approach, ICT and education

Title of the Paper: Computational Requirement of Schema Matching Algorithms


Authors: Peter Martinek, Bela Szikora

Abstract: The integration of different data structures, e.g. the relational databases of information systems, is a current issue in information science. Numerous solutions have emerged recently that aim at high accuracy in measuring the similarity of, and integrating, schema entities coming from different schemas. Research usually evaluates the capabilities of these approaches properly from the point of view of accuracy; however, the computational complexity of the proposed algorithms is hardly ever examined in most of these works. We claim that the efficiency of a solution can only be judged by taking into account both the accuracy and the computational requirements of the participating algorithms. Since many measurement methods and metrics for evaluating accuracy are already known, this paper focuses on analyzing computational complexity. After the problem formulation, the main ideas behind our method are presented. Various approximation techniques and methods from algorithm theory are used to evaluate the different approaches. Three specific approaches were selected to present the working of our method on them in detail. Experiments run on several test inputs are also included.
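To make the complexity concern concrete, consider a naive name-based matcher — a generic sketch, not one of the three approaches analyzed in the paper. Comparing every pair of element names with edit distance already costs O(|S1| · |S2| · L²) for names of length L, which is why computational requirements deserve scrutiny alongside accuracy:

```python
# Generic name-based schema matcher, shown only to illustrate the cost
# of all-pairs similarity computation (not an approach from the paper).
def levenshtein(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def name_similarity(a, b):
    d = levenshtein(a.lower(), b.lower())
    return 1.0 - d / max(len(a), len(b), 1)

def match_schemas(names1, names2, threshold=0.6):
    """All-pairs comparison: the quadratic blow-up with schema size is
    exactly the computational requirement in question."""
    return [(a, b, round(name_similarity(a, b), 2))
            for a in names1 for b in names2
            if name_similarity(a, b) >= threshold]
```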

Keywords: Computational complexity, Schema matching, Approximation techniques in computational requirement estimation

Title of the Paper: Social Phenomenon of Community on Online Learning: Digital Interaction and Collaborative Learning Experience


Authors: Karmela Aleksic-Maslac, Masha Magzan, Visnja Juric

Abstract: Digital interaction in e-learning offers great opportunities for improving education quality, both in classical teaching combined with e-learning and in distance learning. The Zagreb School of Economics & Management (ZSEM) is one of the few higher education institutions in Croatia that systematically uses e-learning in teaching: all courses are developed with an e-learning component, and all use the same LMS (Learning Management System). Discussions are a very important part of any e-learning system. This study focuses on the importance of the social phenomenon of community in online learning. The phenomenon of digital environments and social experience in education is examined through the discussion boards of two different freshman courses offered at ZSEM. The effectiveness and communication dynamics of the discussion boards are analyzed by comparing students' participation rates according to topic, discussion type and discussion quality. The goal of the study is to analyze the potential of online communication tools for creating student-centered digital communities of inquiry. The focus, however, is not on individual student learning and achievement outcomes, but on collaborative learning and student digital interaction from a pedagogical perspective. Based on the social constructivist principle and the assumption that knowledge creation is a shared rather than individual experience, the study examines how and why digital environments enhance the online collaborative learning experience.

Keywords: Discussion boards, e-learning, quality, collaborative learning, online communities of inquiry, communities of learners, information and communication technologies, sociology

Issue 9, Volume 6, September 2009

Title of the Paper: 3D Spatial Touch System Based on Time-of-Flight Camera


Authors: Yang-Keun Ahn, Young-Choong Park, Kwang-Soon Choi, Woo-Chool Park, Hae-Moon Seo, Kwang-Mo Jung

Abstract: Recently developed depth-sensing video cameras based on the time-of-flight principle provide precise per-pixel range data in addition to color video. Such cameras will find application in robotics and in vision-based human-computer interaction scenarios such as games and gesture input systems. Time-of-flight range cameras are becoming more and more available. They promise to make the 3D reconstruction of scenes easier, avoiding the practical issues of 3D imaging techniques based on triangulation or disparity estimation. We present a 3D interactive interface system which uses a depth-sensing camera to touch spatial objects, and we detail its implementation. We also speculate on how this technology will enable new 3D interactions. This study implements a virtual touch screen that tracks the location of the hand, taken from the depth image output by a time-of-flight (TOF) camera, using the Kalman filter. Because it comes from the depth image of the TOF camera, this input is insensitive to light and therefore helps implement a virtual touch screen independent of the surroundings. The biggest problem with conventional virtual touch screens has been that even the slightest change in location led an object to fall out of or enter the virtual touch screen; in other words, the pointing location responded too sensitively to accurately detect the touch point. The Kalman filter solves this problem, as it constantly predicts the pointing location and detects the touch point, without interruption, even under slight changes in location. This enables a stable and smooth change in the location of the pointing point on the virtual touch screen.
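The smoothing role of the Kalman filter can be sketched for a single coordinate of the pointing location; this is a constant-position model with illustrative noise parameters, not the paper's tuning:

```python
class ScalarKalman:
    """Constant-position Kalman filter for one coordinate of the
    pointing location (noise variances q and r are illustrative)."""
    def __init__(self, q=1e-3, r=0.5, x0=0.0, p0=1.0):
        self.q, self.r = q, r    # process / measurement noise variances
        self.x, self.p = x0, p0  # state estimate and its variance

    def update(self, z):
        # Predict: the position is assumed constant, so only the
        # uncertainty grows, by the process noise q.
        self.p += self.q
        # Correct: blend prediction and measurement via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x
```

Fed the jittery per-frame hand positions, the filter output moves smoothly instead of jumping, which is what keeps the touch point from flickering in and out of the virtual screen.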

Keywords: Depth Sensor, Virtual Touch Screen, 3D Interaction, Kalman Filter, Time-of-flight

Title of the Paper: A Cryptography Index Technology and Method to Measure Information Disclosure in the DAS Model


Authors: Zhao Wei, Zhao Dan-Feng, Gao Feng, Liu Guo-Hua

Abstract: Database-as-a-Service (DAS) is a new data management model that allows users to store their data at a database service provider. In the DAS model, data is stored in encrypted form, so querying it takes a long time. In order to improve the efficiency of querying ciphertext, a cryptograph index strategy adapted to unequal-probability queries is presented. Definitions of the disclosure coefficient are then presented, focusing on the problem of information disclosure in cryptograph indices. Finally, the conclusions are analyzed and validated by experiments.

Keywords: Database-as-a-Service, Cryptography index, Query probability, Disclosure coefficient, Encrypted database system, Data cryptography

Title of the Paper: Language Learning via ICT: Uses, Challenges and Issues


Authors: Melor Md Yunus, Maimun Aqsha Lubis, Chua Pei Lin

Abstract: The rapid growth and improvement of Information and Communication Technology (ICT) has led to the diffusion of technology in education. It is believed that ICT brings many advantages to students if it is used under the right circumstances. Although ICT offers advantages and flexibility, this type of learning environment may not be conducive for all learners. This paper describes the use of ICT for learning English, the challenges faced by students in using it, and their attitudes toward it, among urban school students in Kuala Terengganu, Malaysia. Data were collected via a questionnaire survey of second-language students. The results show that students are aware of the benefits of using ICT in language learning; however, they did not spend much of their time on it for learning purposes, only 1-2 hours per week. Most students use ICT for surfing the Internet to get information and for looking up word meanings and pronunciation. Students perceived themselves as having a highly positive attitude towards the use of ICT in learning English. However, the students face two main problems: lack of English proficiency and lack of ICT training.

Keywords: Language learning, ICT pedagogy, technology in education, learning method, ICT in Learning

Title of the Paper: Vehicle Speed and Volume Measurement using Vehicle-to-Infrastructure Communication


Authors: Quoc Chuyen Doan, Tahar Berradia, Joseph Mouzna

Abstract: Intelligent transport systems (ITS) refer to systems that manage road traffic using information and communications technology. One of their most important components is vehicle detection, which provides vehicular data such as volume, density and speed to traffic management centers. This paper proposes a vehicle detection method that measures volume and speed using Vehicle-to-Infrastructure communication and the Global Positioning System. It can be implemented on roadway infrastructure with or without other vehicle detection techniques such as loop detectors.
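A roadside unit receiving consecutive position reports could estimate speed as in this sketch: great-circle distance over elapsed time. The (timestamp, latitude, longitude) message format is an assumption for illustration, not taken from the paper:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mps(fix1, fix2):
    """Speed in m/s from two consecutive (timestamp_s, lat, lon) reports,
    as a roadside unit might compute them from V2I messages."""
    t1, lat1, lon1 = fix1
    t2, lat2, lon2 = fix2
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)
```

Counting distinct vehicle identifiers heard in a time window would give volume by the same infrastructure, with no loop detector required.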

Keywords: Loop detector, Traffic data, GPS, Ad-hoc, V2V, V2I

Title of the Paper: Considering Application Domain Ontologies for Data Mining


Authors: Filipe Mota Pinto, Manuel Filipe Santos

Abstract: The dramatic explosion of data and the growing number of different data sources are exposing researchers to a new challenge - how to acquire, maintain and share knowledge from large databases in the context of rapidly applied and evolving research. This paper describes research into an ontological approach that leverages the semantic content of ontologies to improve knowledge discovery in databases. We analyze how ontologies and the knowledge discovery process may interoperate, and present our efforts to bridge the two fields, knowledge discovery in databases and ontology learning, for successful database usage projects.

Keywords: Ontologies, Knowledge Discovery, Databases, Data Mining

Title of the Paper: Computer-Assisted Instruction in Teaching Early Childhood Literature


Authors: Chew Fong Peng, Teh Ying Wah, Zahari Ishak

Abstract: This paper focuses on how computer-assisted instruction may help in teaching early childhood literature. It also discusses the problems of developing and publishing early childhood literature in Malaysia and their possible solutions. A survey was carried out to find out whether computer-assisted instruction is one of the best methods for teaching early childhood literature effectively. A selected Malay folk tale, together with its exercises and game, is discussed throughout the paper.

Keywords: Computer-Assisted Instruction, Early Childhood Literature, Educational Games

Title of the Paper: A Relook at Logistic Regression Methods for the Initial Detection of Lung Ailments Using Clinical Data and Chest Radiography


Authors: Omar Mohd Rijal, Mohd. Iqbal, Ashari Yunus, Norliza Mohd. Noor

Abstract: The focus of this study is the problem of diagnosing patients with lung ailments such as Tuberculosis (PTB), Pneumonia (PNEU) and Lung Cancer (LC) when they make their initial visit to a medical institution. Clinical data involving symptoms and signs are used to make important decisions before the results of further tests become available. In practice, logistic regression methods are frequently involved in this type of decision making. However, the problem of missing values, when the numerical values of certain explanatory variables are not available, persists in practical situations. In this paper a logistic regression model using four variables (age, cough, loss of weight (LOW) and loss of appetite (LOA)) is investigated for each of the three diseases. The main result of this study is that the probability of misclassifying the three disease types is large, and that good model fitting does not guarantee correct diagnosis. As a viable substitute, a graphical detection method with an 85% chance of correct classification, based on information extracted from chest radiograph images, is proposed.

Keywords: Statistical detection, error probability, lung disease, clinical data, chest radiography, missing values

Title of the Paper: Nonce-Aware Encryption


Authors: Ming-Luen Wu

Abstract: As an alternative perspective on designing IND-CCA2 encryption, we introduce a new security notion, nonce-awareness, for encryption. An encryption scheme is nonce-aware if it is computationally infeasible to produce a valid ciphertext without knowing the associated nonce. We also show that two remarkable IND-CCA2 encryption schemes are nonce-aware.

Keywords: Encryption, Security, Nonce-awareness, Indistinguishability, Chosen-ciphertext attack

Title of the Paper: A Web Mining System


Authors: Jose Aguilar

Abstract: Web Mining has emerged as an appropriate tool for exploiting the knowledge derived from web-user interaction, describing models that use patterns and characterize the profiles of the different groups of users of the Internet. Numerous techniques currently exist to achieve this. Some of these techniques are integrated in this work to build a Hybrid Web Mining System that allows useful information about web users to be extracted. Specifically, three Web Mining techniques were used: Sequential Patterns, Path Analysis and Cubes. The system obtains a set of access patterns of the users of a website and arranges them in a multidimensional structure called a Cube. Using this cube, the system can discover, among other things, correlations between web pages and user groups, and the behaviors of web users.
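For illustration only (the data and names below are invented, not the authors' implementation), such a multidimensional Cube of access patterns can be sketched as a counter indexed by several dimensions at once, which can then be sliced along any dimension:

```python
from collections import Counter

# Hypothetical access-log entries: (user_group, page, hour)
logs = [("students", "/home", 9), ("students", "/search", 9),
        ("staff", "/home", 14), ("students", "/home", 10)]

# The "cube": access counts indexed by all three dimensions at once
cube = Counter(logs)

def slice_page(cube, page):
    """Slice the cube on the page dimension, aggregating over time."""
    out = Counter()
    for (group, pg, hour), count in cube.items():
        if pg == page:
            out[group] += count
    return out

print(dict(slice_page(cube, "/home")))  # {'students': 2, 'staff': 1}
```

Slicing and aggregating along different dimensions is what lets the system correlate pages with user groups as the abstract describes.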

Keywords: Web Mining, Sequential Patterns, Path Analysis, Cubes, Pattern Recognition, Data Mining

Title of the Paper: Developing a Question Answering System for the Slovene Language


Authors: Ines Ceh, Milan Ojstersek

Abstract: In today’s world the majority of information is sought on the Internet, most commonly through search engines. However, since the result of a query to a search engine is a ranked list of results, this is not the final step: it is up to the user to review the results and determine which of them provides the information needed. This process is often time-consuming and does not always yield the sought-after information. Besides the number of returned results, a limiting factor is often the users’ inability to form the correct query. A solution can be found in question answering systems, where the user poses a question in natural language, much as when talking to another person, and receives an exact answer instead of a list of possible results. This paper presents the design of a question answering system for the natural Slovene language. The system searches for answers within our target domain (the Faculty of Electrical Engineering and Computer Science) using a local database, the databases of the faculty’s information system, MS Excel files, and web service calls. We have developed two separate applications: one for users and the other for the administrators of the system. With the latter, administrators supervise the functioning and use of the entire system; the former is the system that actually answers the questions.

Keywords: Question answering, Slovenian language, morphological dictionary of Slovenian language, Question Classification, machine learning, question templates, personalization

Title of the Paper: Local Search Engine with Global Content based on Domain Specific Knowledge


Authors: Sandi Pohorec, Mateja Verlic, Milan Zorman

Abstract: In the growing need for information we have come to rely on search engines. The use of large-scale search engines, such as Google, is as common as surfing the World Wide Web. We are impressed with the capabilities of these search engines, but there is still a need for improvement. A common problem with searching is the ambiguity of words: their meaning often depends on the context in which they are used or varies across specific domains. To resolve this we propose a domain-specific search engine that is globally oriented. We intend to provide content classification according to the target domain concepts, access to privileged information, personalization, and custom ranking functions. Domain-specific concepts have been formalized in the form of an ontology. The paper describes our approach to a centralized search service for domain-specific content. The approach uses automated indexing for various content sources, which can take the form of a relational database, a web service, a web portal or page, various document formats, and other structured or unstructured data. The gathered data is tagged using various approaches and classified against the domain classification. The indexed data is accessible through a highly optimized and personalized search service.

Keywords: Information search, personalization, indexes, crawling, domain specific crawling, natural language processing, content tagging, distributed data sources, ranking functions

Title of the Paper: A Correctness Criterion for Schema Dominance Centred on the Notion of ‘Information Carrying’ between Random Events


Authors: Junkang Feng, Kaibo Xu

Abstract: In systems development and integration, whether the instances of a data schema may be recovered from those of another is a question that may be seen as profound. This is because, if this is the case, one system is dominated by, and can therefore be replaced by, another without losing the systems' capacity for providing information, which constitutes a correctness criterion for schema dominance. And yet, this problem does not seem to have been well investigated. In this paper we shed some light on it. In the literature, the works closest to this problem are based upon the notion of ‘relevant information capacity’, which is concerned with whether one schema may replace another without losing the system's capacity for storing the same data instances. We observe that the rationale of such an approach is overly intuitive (even though the techniques involved are sophisticated), and we reveal that it is the phenomenon of one or more instances of a schema telling us truly what an instance of another schema is that underpins a convincing answer to this question. This is a matter of one thing carrying information about another. Conventional information-theoretic approaches are based upon the notion of entropy and its preservation. We observe that schema instance recovery requires looking at much more detailed levels of informational relationships than that, namely random events and particulars of random events.

Keywords: Database design, Schema dominance, Schema transformation, System integration, Information content, Information capacity

Title of the Paper: Searching Minimal Fractional Graph Factors by Lattice Based Evolution


Authors: Xiyu Liu, Yinghong Ma, Yangyang Zheng

Abstract: Spanning subgraphs are necessary for communication in networks. Apart from theoretical existence results, effective techniques for searching for graph factors constitute an important problem in graph theory, complex networks, and applications, and the problem is NP-hard. In this paper, we first propose a lattice-based evolution technique. We then present an evolutionary searching paradigm for minimum fractional graph factors. A simple Markov analysis of the proposed genetic algorithm is given, together with experiments on the effects of the parameters on the performance of the algorithms.

Keywords: Niche, evolution, lattice, minimum factors, graph

Title of the Paper: Using a Combined PLS Path and MCDM Approach in Determining the Effectiveness of Taiwan's Outward FDI Policy


Authors: Yi-Hui Chiang, Chih-Young

Abstract: Various studies have looked at outward foreign direct investment (FDI) from the host country perspective but have paid little attention to parent country determinants. Does outward FDI policy (the investment upper limit regulation) matter? In this paper, we propose a combined partial least squares (PLS) path model and multiple criteria decision-making (MCDM) approach to study Taiwan’s outward FDI to China. The main purpose of this study is to investigate the determinants of Taiwanese firms’ decisions to make FDI into China. Using data from Taiwanese optoelectronics firms doing business in Taiwan between 1998 and 2007, the results of the proposed model show that the outward FDI policy of the parent country is a key factor in Taiwan’s outward FDI into China. It is also found that the macroeconomic environment of the host country was a stronger determinant of Taiwan’s outward FDI into China than that of the parent country.

Keywords: Foreign direct investment (FDI), partial least squares (PLS), multiple criteria decision-making (MCDM), Parent country, upper limit regulation

Title of the Paper: The Factors Influencing Individual's Behavior on Privacy Protection


Authors: Sheng-Fei Hsu, Dong-Her Shih

Abstract: An individual’s privacy-protecting behavior is affected not only by personal psychological factors but also by external influences. However, the latter have often been ignored in previous research. To investigate how external as well as internal factors simultaneously affect one’s privacy concern and restrain related behavior, this study applied perceived behavioral control to modify a previous privacy model. In addition, a hypothesized model is proposed to interpret how the related factors influence an individual’s privacy-protecting behavior. The results of this study also indicate significant relationships between personal privacy perception and perceived behavioral control.

Keywords: Privacy concern, Perceived privacy, Trust, Perceived behavioral control, Behavior on privacy-protecting, Theory of planned behavior

Title of the Paper: Towards an Understanding of the Behavioral Intention to Use Mobile Knowledge Management


Authors: Jeung-Tai E. Tang, Chihui Chiang

Abstract: Mobile knowledge management allows important knowledge to be acquired quickly using mobile equipment over wireless networks. Many knowledge management activities occur in the workplace and in our daily activities, and mobility may increase self-belief and work efficiency through convenient knowledge management in mobile environments. Studies in recent years have shown that the perceived ease of use and perceived usefulness constructs of the Technology Acceptance Model are important in determining individuals’ acceptance and use of IT. In this study, we introduce convenience and self-efficacy as new factors that reflect the characteristics of mobile knowledge management. This paper addresses why users want to use mobile knowledge management and how user adoption is affected by convenience and self-efficacy. The results showed that the research model fully mediated usage behavior intentions even for mobile knowledge management in the wireless network environment.

Keywords: Mobile knowledge management, Technology acceptance model, Convenience, Self-efficacy

Issue 10, Volume 6, October 2009

Title of the Paper: Geometry Modeling for Cotton Leaf Based on NURBS


Authors: Zhao Ming, Yang Juan, Zhang Xiaoshuan

Abstract: The digitalization of agriculture brings controllable industrial production and computer-aided design ideas into agriculture. NURBS curves and surfaces are a widely popular technology for describing free-form shapes, extensively used in CAD systems, and curves and surfaces can easily be generated with NURBS. In this paper, we develop a model of cotton organs using NURBS surfaces, present an introduction to the topological structure of a virtual plant from our personal perspective, and simulate the 3-D growth of cotton with OpenGL and VC++ 6.0. Based on VC++ 6.0 and OpenGL, methods for establishing models of the main cotton organs using computer graphics techniques are presented, and realistic results have been achieved. Burls of stems and fruit branches are simulated by an octahedron-like prism. The other organs - bell, caulis leaf, fruit branch leaf, petal, and bract - are built with NURBS. In contrast to three-dimensional digitizing, using NURBS we do not need to obtain an abundance of accurate data or bother with regression equations. The topological structure of cotton is represented by C++ classes: a stem class and a fruit branch class.

Keywords: NURBS, OpenGL, C++, Cotton, Topological structures

Title of the Paper: Adaptive Categorization in Complex Systems


Authors: Seyed Shahrestani

Abstract: A fast and reliable method for categorization of patterns that may be encountered in complex systems is described. Most pattern recognition and classification approaches are founded on discovering the connections and similarities between the members of each class. In this work, a different view of classification is presented. The classification is based on identification of distinctive features of patterns. It will be shown that the members of different classes have different values for some or all of such features. The paper will also show that by making use of the distinctive features and their corresponding values, classification of all patterns, even for complex systems, can be accomplished. The classification process does not rely on any heuristic rules. In this process, patterns are grouped together in such a way that their distinctive features can be explored. Such features are then used for identification purposes.

Keywords: Adaptive recognition, Categorization and classification, Distinctive Features, Complex Systems, Feature Extraction, Negative Recognition

Title of the Paper: OptimalSQM: Integrated and Optimized Software Quality Management


Authors: Ljubomir Lazic, Nikos E. Mastorakis

Abstract: Software testing provides a means to reduce errors and cut maintenance and overall software costs. Early in the history of software development, testing was confined to testing the finished code, but testing is more of a quality control mechanism. As the practice of software development has evolved, there has been increasing interest in expanding the role of testing upwards into earlier SDLC stages, embedding testing throughout the systems development process. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. While it can be argued that there has been some improvement, it is apparent that many of these techniques and tools are isolated to a specific lifecycle phase or functional area. This paper presents a set of best-practice models and techniques integrated into an optimized and quantitatively managed software testing process (OptimalSQM), expanding testing throughout the SDLC. Further, we explain how the Quantitative Defect Management (QDM) Model can be enhanced to be practically useful for determining which activities need to be addressed to improve the degree of early and cost-effective software fault detection with assured confidence, and we propose optimality and stability criteria for the control of the very complex STP dynamics problem.

Keywords: Software Testing, Quality, Testing optimization, Cost of Quality, Testing stability criteria

Title of the Paper: Spatial Information on Site-Specific Seismic Response at Hongseong Damaged by 1978 Earthquake in Korea


Authors: Chang-Guk Sun

Abstract: Site characterization on geologic and soil conditions was performed for evaluating the site effects relating to the site-specific seismic response characteristics at a small urbanized area, Hongseong, in Korea, where structural damages were recorded by an earthquake of magnitude 5.0 on October 7, 1978. In the field, various geotechnical site investigations composed of borehole drillings and seismic tests for determining shear wave velocity (VS) profile were carried out at 16 sites. Based on the geotechnical data from site investigation and additional collection in and near Hongseong, an expert information system on geotechnical information was implemented with the spatial framework of GIS for regional geotechnical characterization across the entire study area. For practical application of the GIS-based geotechnical information system to estimate the site effects causing seismic hazards for the Hongseong area, spatial seismic zoning maps on geotechnical parameters, such as the bedrock depth and the site period, were created at the area of interest. Furthermore, seismic zonation of site classification according to the mean VS to a depth of 30 m from ground surface was also performed for seismic design at any location in the area of interest. From the spatial geotechnical information and seismic zonations, the capability of seismic amplification was examined at plain and hill locations in Hongseong.

Keywords: Site effects, Geotechnical information, Seismic zonation, Site period, Geographic information system

Title of the Paper: Toward a New Paradigm: Mashup Patterns in Web 2.0


Authors: Cheng-Jung Lee, Shung-Ming Tang, Chang-Chun Tsai, Ching-Chiang Chen

Abstract: The advent of Web 2.0 has changed the playing field drastically and opened a new era of platform opportunities. By opening its platform and making it remixable, Google Maps changed the rules; now, the rest of the industry is struggling to catch up with the world of web Mashups and Google’s momentum. But what about the conceptual heart of other popular Mashups? We focus on looking for patterns in successful Web 2.0 Mashups. We systematize concepts from the top 60 popular Mashups on ProgrammableWeb and present seven collections of Mashup patterns: Data Source Mashups, Process Mashups, Consumer Mashups, Enterprise Mashups, Client-Side Mashups, Server-Side Mashups and Developer Assembly Mashups, which build platforms to foster innovation in assembly, where the remixing of data and services creates new opportunities and markets. In this paper we highlight the latest Mashup trends, explore the design principles of Mashup architecture, evaluate the most popular Mashup technologies for the enterprise level, present the seven patterns that exist for implementation, and capture the fundamental behavioral aspects of Mashups. What we present in this paper can be generalized to other Mashups.

Keywords: Mashup, Web 2.0, Pattern, paradigm, Google Maps, API

Title of the Paper: Recommendation Method that Considers the Context of Product Purchases


Authors: Tsuyoshi Takayama, Tetsuo Ikeda, Hiroshi Oguma, Ryosuke Miura, Yoshitoshi Murata, Nobuyoshi Sato

Abstract: We propose herein a technique for product recommendation in E-commerce that considers the context of product purchases, and verify the effectiveness of the technique through an evaluation experiment. Researchers have been aggressively studying techniques that stores can use to recommend to customers products that have relatively high purchase potential. Collaborative filtering is representative of conventional techniques. However, collaborative filtering is based on the hypothesis that similar customers purchase similar products, and the context of product purchases is not fully considered. In the present study, a context matrix by which to manage the context history of product purchases is proposed. Collaborative filtering cannot distinguish the two facts ‘Product B was purchased after Product A’ and ‘Product A was purchased after Product B’. The context matrix, however, enables such information to be expressed and managed separately. We also propose four types of context matrix update methods, which differ in the subset selection of the purchase history and the user selection for obtaining the purchase history.
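As an illustration of the ordered-pair idea (a hypothetical sketch, not the authors' implementation), a context matrix can be kept as nested counts in which 'B after A' and 'A after B' are distinct entries:

```python
from collections import defaultdict

class ContextMatrix:
    """matrix[a][b] counts how often product b was purchased after product a,
    so 'B after A' and 'A after B' remain distinguishable."""
    def __init__(self):
        self.matrix = defaultdict(lambda: defaultdict(int))

    def record_history(self, purchases):
        # purchases: one customer's product ids in chronological order
        for earlier, later in zip(purchases, purchases[1:]):
            self.matrix[earlier][later] += 1

    def recommend(self, last_product, k=3):
        # rank products by how often they followed the last purchase
        followers = self.matrix[last_product]
        return [p for p, _ in sorted(followers.items(), key=lambda x: -x[1])[:k]]

cm = ContextMatrix()
cm.record_history(["A", "B", "C"])
cm.record_history(["A", "B"])
# 'B after A' was seen twice; 'A after B' never happened
print(cm.recommend("A"))  # ['B']
```

A plain co-occurrence count, by contrast, would store the unordered pair {A, B} and lose exactly the directional information the abstract emphasizes.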

Keywords: Recommendation, database, data mining, and E-commerce

Title of the Paper: A Case Study on Usability Metrics Applied in Romanian E-Commerce Environment


Authors: Dan-Andrei Sitar-Taut, Liana-Maria Stanca, Robert Buchmann, Ramona Lacurezeanu

Abstract: This paper extends our previous research regarding the Romanian electronic market as the context for launching a business opportunity in this domain. Here we present a study of indigenous e-commerce websites in general and their usability in particular. Nowadays, an impressive number of virtual businesses have opened despite the crisis, while others have closed their doors for the same reason. Thus, before starting a new business, it is strongly recommended to make a deeper analysis of the environment and potential competitors, especially in the Web 2.0 context. Web 2.0 provides a winning edge over e-markets that lack social connectivity and high-usability features. Usability cannot be quantified easily. Despite this fact, this paper proposes a usability evaluation methodology that is tested in the local virtual space. For a visitor, usability is more a feeling than a figure, and the designer must endow the website with it. We have also tackled a main branch of electronic marketing, Search Engine Optimization (SEO). In order to promote a business and put or keep it at the top of the search result list, continuous optimization of the dynamic website is required. We assert that usability and SEO must both be considered for the success of an electronic business. For our purposes, a statistical study was performed using a battery of tests including the Kolmogorov-Smirnov, Spearman, and Kruskal-Wallis tests. Some data mining tools were also used to confirm or refute the statistical assertions.

Keywords: Usability, SEO, search engines, data mining, Kolmogorov–Smirnov, Spearman, Kruskal-Wallis

Title of the Paper: Virtual Deformation of Soft Tissue using Bulk Variables


Authors: K. Sundaraj

Abstract: We present an alternative online simulation model for human tissue. Online simulation of human tissue deformation during surgical training or surgical assistance is becoming increasingly important within the medical community. Unfortunately, even classical simulation models find human tissue computationally too costly for online simulation. In this paper, we simplify the complex biomechanical nature of human tissue within reasonable limits to develop a mathematical model that can be used for online simulation. This simplification is based on two principles: volume conservation and Pascal’s principle. Volume conservation is inherent to many organs in the human body due to the high concentration of blood (an almost incompressible liquid) in them. Given an externally applied force, we use Pascal’s principle to obtain the global deformation vector at each time step of the simulation.
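The interplay of the two principles can be caricatured in a few lines (a crude, hypothetical stand-in: real volume is not a sum of node radii, and the paper's model is far richer). One node is pressed inward and, as an incompressible fluid would under Pascal's principle, the displaced amount is redistributed uniformly to the remaining nodes so the total is conserved:

```python
def deform(radii, pressed_index, depth):
    """Push one surface node inward by `depth`, then bulge every other node
    outward by the same total amount, shared uniformly (Pascal's principle),
    so the summed 'volume' proxy stays constant (volume conservation)."""
    n = len(radii)
    new = list(radii)
    new[pressed_index] -= depth
    bulge = depth / (n - 1)  # uniform redistribution over the other nodes
    for i in range(n):
        if i != pressed_index:
            new[i] += bulge
    return new

# Press the first of four nodes inward by 0.3: the other three bulge by 0.1 each
deformed = deform([1.0, 1.0, 1.0, 1.0], 0, 0.3)
```

Because the update is a constant-time redistribution rather than a system solve, this style of model is cheap enough for the per-time-step use the abstract targets.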

Keywords: Virtual Reality, Soft Tissue Simulation, Surgical Simulators

Title of the Paper: Knowledge Induction from Medical Databases with Higher-Order Programming


Authors: Nittaya Kerdprasop, Kittisak Kerdprasop

Abstract: Medical data mining is an emerging area of computational intelligence applied to automatically analyzing patients’ records, aiming at the discovery of new knowledge potentially useful for medical decision making. Induced knowledge is anticipated not only to increase accurate diagnosis and successful disease treatment, but also to enhance safety by reducing medication-related errors. Modern healthcare organizations regularly generate huge amounts of electronic data that could serve as a valuable resource for knowledge induction to support the decision-making of medical practitioners. Unfortunately, a domain-specific decision support system that provides a suite of customized and flexible tools to efficiently induce knowledge from medical databases with representational heterogeneity does not currently exist. We therefore design and develop a medical decision support system based on a powerful logic programming framework. The proposed system includes a knowledge induction component that induces knowledge from clinical data repositories; the induced knowledge can also be deployed to pre-treatment data from other sources. The implementation of the knowledge induction engine is presented to demonstrate the power of higher-order programming in a logic-based language. The flexibility of our mining engine is obtained through the pattern matching and meta-programming facilities provided by the logic-based language.

Keywords: Medical decision making, Medical informatics, Logic-based knowledge induction, Higher-order programming

Title of the Paper: Contaminants Analysis in Aircraft Engine Oil and its Interpretation for the Overhaul of the Engine


Authors: B. Leal, J. Ordieres, S. F. Capuz-Rizo, P. Cifuentes

Abstract: In this work, the authors determine, by means of Artificial Intelligence techniques, the actual possibilities of using spectrometric analyses, performed periodically on the engine oil, as predictors of the condition of the engine. Some similar works have led to a simple linear regression model, but in the particular case of this work we evaluate whether the specific environmental conditions of the region and the low use of the aircraft allow the same criterion to be used or call for some specific variant.

Keywords: Oils contaminants, wearing models, artificial intelligence techniques

Issue 11, Volume 6, November 2009

Title of the Paper: Remote Updating Procedures for Mobile Point of Sale Terminals


Authors: Ales Zelenik, Zdenko Mezgec

Abstract: This article presents an efficient remote updating system that supports software updating through different communication channels. The emphasis is on the robustness of transmission, data size optimizations, quick remote firmware replacement, etc. Besides GPRS and IP-based communication, we also explain a unique updating procedure that transmits the data over the speech channel of a mobile phone (Data over Voice, DOV). All parts of the updating system are thoroughly explained.

Keywords: Remote update, speech channel, Data over Voice – DOV, GSM, mobile POS terminals

Title of the Paper: Improved Mining of Software Complexity Data on Evolutionary Filtered Training Sets


Authors: Vili Podgorelec

Abstract: With the evolution of information technology and software systems, software reliability has become one of the most important topics in software engineering. As society’s dependency on software systems increases, so does the importance of efficient software fault prediction. In this paper we present a new approach to improving the classification of faulty software modules. The proposed approach is based on filtering training sets through a method for identifying and removing data outliers. The method uses an ensemble of evolutionarily induced decision trees to identify the outliers. We argue that a classifier trained on a filtered dataset captures a more general knowledge model and should therefore also perform better on unseen cases. The proposed method is applied to a real-world software reliability analysis dataset and the obtained results are discussed.

Keywords: Data mining, classification, evolutionary decision trees, filtering training sets, software fault prediction, search-based software engineering

Title of the Paper: Electrical Energy Consumption Forecasting in Oil Refining Industry Using Support Vector Machines and Particle Swarm Optimization


Authors: Milena R. Petkovic, Milan R. Rapaic, Boris B. Jakovljevic

Abstract: In this paper, Support Vector Machines (SVMs) are applied to predicting electrical energy consumption in the atmospheric distillation unit of a particular oil refinery. During the cross-validation process of SVM training, a Particle Swarm Optimization (PSO) algorithm was used to select the free SVM kernel parameters. Incorporating PSO into the SVM training process greatly enhanced the quality of prediction. Furthermore, various kernel functions were used and optimized in the process of forming the SVM models.
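A minimal sketch of the PSO component may help fix ideas (illustrative only: the quadratic stand-in objective below replaces the paper's actual objective, the SVM's cross-validation error as a function of its kernel parameters):

```python
import random

random.seed(0)  # deterministic run for this sketch

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over box-bounded parameters."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp each coordinate to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: in the paper's setting this would be the SVM's
# cross-validation error as a function of kernel parameters such as (C, gamma).
error_surface = lambda p: (p[0] - 2.0) ** 2 + (p[1] - 0.5) ** 2
best, err = pso(error_surface, bounds=[(0.0, 10.0), (0.0, 1.0)])
```

Because the objective is treated as a black box, the same loop works whether the surface is a toy quadratic or an expensive cross-validation score, which is what makes PSO attractive for kernel parameter selection.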

Keywords: Support Vector Machines (SVM), Kernel Functions, Particle Swarm Optimization (PSO), Electrical Energy Prediction, Oil Refining

Title of the Paper: Method of Event Location Identification Using GPS and Camera Function of Mobile Phones


Authors: Kento Hirano, Yusuke In, Mayuko Kitazume , Masakazu Higuchi, Syuji Kawasaki, Hitomi Murakami

Abstract: In recent years, network applications with location-awareness have been attracting a lot of attention as a technical element of ubiquitous computing. Among such applications, those addressing environmental issues especially require, for the sake of immediate detection and the provision of solutions, highly precise auto-detected location information for the relevant places. To realize this precision, the technical challenges are evaluating the precision of GPS information and determining how to improve it. So far, however, these issues have rarely been studied. In this paper, we consider mobile applications on mobile phones, first evaluating the precision of GPS information on mobile phones, and second studying how to improve that precision. Based on these results, we discuss the possibility of applying mobile phone location information to environmental issues and the remaining technical problems.

Keywords: GPS, Mobile Phone, Pattern matching, Correlation Coefficient

Title of the Paper: Applications of Genetic Algorithms


Authors: Marius-Constantin Popescu, Liliana Popescu, Nikos Mastorakis

Abstract: This paper presents the main directions of application for genetic algorithms. There is a large class of interesting problems for which no fast algorithms have yet been developed, many of them optimization problems that occur frequently in applications. Genetic algorithms belong to the family of heuristic algorithms and can be applied successfully to problems that do not admit polynomial-time algorithms. As the name suggests, genetic algorithms are inspired by nature, specifically by the way genetic recombination improves a species.

Keywords: Genetic operators, Genetic programming, Objective function
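As a minimal sketch of the recombination-based search this abstract describes, a textbook genetic algorithm (tournament selection, one-point crossover, bit-flip mutation; all parameters here are illustrative, not taken from the paper) applied to the classic OneMax problem could look like:

```python
import random

def genetic_algorithm(fitness, length=16, pop_size=30, generations=60,
                      mutation_rate=0.02, seed=1):
    """Textbook GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            # pick the fitter of two random individuals
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, length)         # one-point crossover
            child = p1[:cut] + p2[cut:]
            # flip each bit independently with probability mutation_rate
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# OneMax: maximize the number of 1-bits in the chromosome.
best = genetic_algorithm(sum)
```

The fitness function is the only problem-specific piece; swapping in an application objective (scheduling cost, route length, etc.) turns the same loop into the kinds of optimizers the paper surveys.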

Title of the Paper: An RFID and Multi-agent Based System for Improving Efficiency in Patient Identification and Monitoring


Authors: Cristina Turcu, Tudor Cerlinca, Cornel Turcu, Marius Cerlinca, Remus Prodan

Abstract: Hospitals today are under increasing pressure to improve the quality and efficiency of patient identification and monitoring procedures. Most patient health records are stored in separate systems, and there are still huge paper trails of records that health-care providers must keep to comply with different regulations. This paper proposes an RFID-based system (named SIMOPAC) that integrates RFID and multi-agent technologies in health care in order to make patient emergency care as efficient and risk-free as possible, by providing doctors with as much information about a patient as quickly as possible. Every hospital could use SIMOPAC with its existing system in order to promote patient safety and optimize hospital workflow. In this paper we concentrate on RFID technology and how it can be used in emergency care to identify patients and to obtain real-time information concerning the patients’ biometric data, which might be used at different points in the health system (laboratory, family physician, etc.). We describe a general-purpose architecture and data model designed for collecting ambulatory data from various existing devices and systems, as well as for storing and presenting clinically significant information to the emergency care physician.

Keywords: E-health, electronic medical records, RFID, emergency care, embedded system

Title of the Paper: A Simultaneous Application of Combinatorial Testing and Virtualization as a Method for Software Testing


Authors: Ljubomir Lazic, Snezana Popovic, Nikos E. Mastorakis

Abstract: We propose in this paper a general framework for integrated end-to-end testing of IT architecture and applications using the simultaneous application of combinatorial testing and virtualization. Combinatorial testing methods are often applied in configuration testing. The combinatorial approach to software testing uses models; in particular, an Orthogonal Array Testing Strategy (OATS) is proposed as a systematic, statistical way of testing pair-wise interactions, generating a minimal number of test inputs so that selected combinations of input values are covered. Virtualization, in the testing process, is based on deploying the necessary environments on multiple virtual machines that run on one physical computer or on a small group of them; its benefits are reduced cost of equipment and related resources, reduced time required to manage the testing process, and easier set-up and removal of test infrastructure. Together, combinatorial testing and virtualization present a practical approach to improving the testing process by balancing quality, cost, and time.

Keywords: Software Testing, Virtual Machines, Environment Virtualization, Combinatorial testing
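OATS proper derives tests from orthogonal arrays; as a rough illustration of the pair-wise coverage goal it targets (not the paper's method), a simple greedy generator that repeatedly picks the candidate covering the most still-uncovered parameter-value pairs can be sketched as:

```python
from itertools import combinations, product

def pairwise_suite(factors):
    """Greedy pairwise test generation: pick, from the full Cartesian product,
    the candidate covering the most not-yet-covered parameter-value pairs."""
    uncovered = set()
    for (i, vi), (j, vj) in combinations(enumerate(factors), 2):
        for a, b in product(vi, vj):
            uncovered.add((i, a, j, b))
    suite = []
    candidates = list(product(*factors))
    while uncovered:
        def gain(t):
            return sum((i, t[i], j, t[j]) in uncovered
                       for i, j in combinations(range(len(t)), 2))
        best = max(candidates, key=gain)   # greedy choice
        suite.append(best)
        for i, j in combinations(range(len(best)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return suite

# Three configuration factors with 3, 3 and 2 levels: the full product is
# 18 tests, but pairwise coverage needs far fewer.
tests = pairwise_suite([["a", "b", "c"], ["x", "y", "z"], [0, 1]])
```

Each test in the suite would then be executed in its own virtual machine snapshot, which is the combination the paper advocates.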

Title of the Paper: Bank Interest Margins under Information Asymmetry and Centralized vs. Decentralized Loan Rate Decisions: A Two-Stage Option Pricing Model


Authors: Jyh-Horng Lin, Jyh-Jiuan Lin, Rosemary Jou

Abstract: The literature to date is largely silent on the behavior of outside portfolio managers of large money-center banks. In a centralized loan portfolio construction with decentralized loan portfolio management, changes in the bank’s degree of capital market imperfection have direct effects on the bank’s interest margin through both the centralized and the decentralized loan rate determinations. We use a two-stage model of option-based analysis to study how information asymmetry and optimal bank loan rates relate to one another. We find that the decentralized loan rate managed by the outside loan manager is positively related to the bank’s degree of capital market imperfection. The centralized loan rate managed by the bank is positively related to its degree of capital market imperfection under strategic complements, but negatively related under strategic substitutes.

Keywords: Centralized vs. decentralized loan rate; Information asymmetry; Black-Scholes formula
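As background for the option-based analysis (the paper's two-stage model itself is not reproduced here), the plain Black-Scholes European call price that such contingent-claim models build on can be computed as:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call: spot S, strike K,
    maturity T (years), risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# At-the-money call, one year to maturity.
price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
```

In the banking application, equity is valued as such a call on bank assets with the deposit repayment as the strike, which is what links margin decisions to option values.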

Title of the Paper: Artifacts Recovery and Understanding Using High Level Models


Authors: Nadim Asif, Faisal Shahzad, Najia Saher, Rafaquet Kazami, Waseem Nazar

Abstract: Systems must be understood and presented at higher levels of abstraction before they can be changed or re-engineered to meet current requirements. As changes accumulate, the documentation drifts away from the implemented source code. High-level models are used to recover artifacts and to understand the system in order to perform maintenance activities. This paper presents an approach to developing high-level models from the existing source code and documents.

Keywords: Design Recovery, Re-Engineering, Reverse Engineering, Program Understanding, Software Maintenance

Title of the Paper: Fuzzy Covering and Partitioning Problems Based on the Expert Valuations: Application in Optimal Choice of Candidates


Authors: Gia Sirbiladze, Bezhan Ghvaberidze, Anna Sikharulidze, Bidzina Matsaberidze, David Devadze

Abstract: A new approach to discrete optimization (partitioning and covering) problems is presented, based on expert knowledge representations. A priori uncertain information on the alternatives is given by a probability distribution, and a priori certain information on the experts' competences is given by weights. A new criterion is introduced for a minimal fuzzy covering or partitioning problem: the minimal value of average misbelief in possible alternatives. A bicriterial problem is obtained by combining the new criterion with the criterion of minimizing the average price of the covering or partitioning. The proposed approach is illustrated by an example for the partitioning problem.

Keywords: Minimal fuzzy covering or partitioning, Minimal compatibility level, Positive and negative discrimination measures, Average misbelief criterion, Bicriterial problem

Issue 12, Volume 6, December 2009

Title of the Paper: Relationship between Muscle Activities and Different Movement Patterns on an Unstable Platform using Data Mining


Authors: Jung-Ja Kim, Yong-Jun Piao, Ah Reum Lee, Tae-Kyu Kwon, Yonggwan Won

Abstract: Association rule mining is widely used in market-basket analysis. Association rule discovery can mine rules that carry more beneficial information by reflecting item importance for particular products. Association rule mining could also be a promising approach for clinical decision support systems, discovering meaningful hidden rules and patterns from the large volume of data obtained from the problem domain. The objective of this study was to analyze the muscle activities of different movement patterns on a posture-control training system with an unstable platform, using association rule mining methodology. In order to find relational rules between posture training type and muscle activation pattern, we investigated an application of association rule mining to biomechanical data obtained mainly for the evaluation of postural control ability. To investigate the relationship between the different movement patterns and muscle activities, fifteen healthy young subjects took part in a series of postural control training sessions using a training system that we developed. The electromyography of the muscles in the lower limbs was recorded and analyzed under the different movement patterns, and an improved association rule mining methodology was applied to analyze the relationship between movement patterns and muscle activities. The results showed significant differences in muscle activities across the different movement patterns. They suggest that, through the choice of movement pattern, lower-extremity strength training can target specific muscles at different intensities, and that postural control ability can be improved by such strength training. Through the analysis results, we tried to find the best training method for improving postural control ability by improving lower-extremity muscular strength. The discovered rules could serve as useful knowledge for rehabilitation and clinical experts.

Keywords: Muscle activity, Association rule mining, Unstable platform, Postural control, Training
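As a minimal sketch of the support/confidence rule discovery this study relies on (the improved methodology and the real EMG data are not reproduced; the toy transactions below are invented for illustration):

```python
from itertools import combinations

def association_rules(transactions, min_support=0.5, min_confidence=0.7):
    """Enumerate single-antecedent/single-consequent rules A -> B that meet
    minimum support and confidence thresholds (a toy Apriori-style pass)."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    def support(itemset):
        # fraction of transactions containing the whole itemset
        return sum(itemset <= t for t in transactions) / n
    rules = []
    for a, b in combinations(items, 2):
        for ant, con in [({a}, {b}), ({b}, {a})]:
            s = support(ant | con)
            if s >= min_support and s / support(ant) >= min_confidence:
                rules.append((tuple(ant), tuple(con), s, s / support(ant)))
    return rules

# Toy transactions: a movement-pattern label together with an "active
# muscle" observation (hypothetical item names).
data = [{"fwd", "tibialis"}, {"fwd", "tibialis"},
        {"fwd", "tibialis"}, {"lateral", "peroneus"}]
rules = association_rules(data)
```

In the study's setting, each transaction would hold one trial's movement-pattern label plus discretized muscle-activation levels, so the mined rules directly link patterns to muscles.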

Title of the Paper: Optimal Bank Interest Margin and Shareholder Interest Conflicts under CEO Overconfidence: A Constrained Option-Pricing Model


Authors: Jyh-Horng Lin, Peirchyi Lii, Rosemary Jou

Abstract: Little is known about how equity returns allocated between current and new shareholders are altered in reaction to chief executive officer (CEO) overconfidence. This paper uses a nonlinear constrained contingent-claim methodology in the spirit of Black and Scholes (1973) and Merton (1974) to explore interest conflicts between current and new shareholders when an overconfident bank CEO overestimates returns on investment projects and consequently raises too much external funding when internal resources become scarce. We show that low levels of bank interest margins or equity returns, which decrease the claims of current shareholders, are associated with investment distortions, whereas high levels of bank equity returns, which dilute the claims of current shareholders, are associated with external financing distortion.

Keywords: CEO Overconfidence, Bank Margin, Call Pricing

Title of the Paper: An Improvement Framework for E-learning Processing Method Development at Centralized Organization Education


Authors: Ching-Chiang Chen, Dong-Her Shih, Hsiu-Sen Chiang, Cheng-Jung Lee

Abstract: In the past decade, E-learning has played an increasingly important role in modern education. Distance learning and E-learning are receiving more attention, not only from enterprise organizations but also from sealed (closed) organizations such as the military. Thus, in the troops, which constitute a sealed and centralized organization, traditional education and training should be changed to catch up with modern training methods. This paper is based on Web Services, designed and designated by the World Wide Web Consortium (W3C) to promote communication among platforms. In addition, we use the Technology Acceptance Model (TAM), in which Perceived Usefulness and Perceived Ease of Use capture the degree to which a user believes the IT will strengthen his or her task performance. In this exploratory study, we use interview and survey methods to collect data and analyze viewpoints from the research field. The purpose of this study is three-fold. First, it heeds the call for theoretically based empirical research on the TTPi. Second, it examines the C Troop's choice and usage perspectives in combination, in an attempt to find their individual and combined effects on E-learning choice and usage. Finally, it attempts to synthesize research in this area by developing a model showing how the TTPi influences troops. The results of this study lead us to develop some propositions and a Troop Training Process improvement (TTPi) model with better strategies for such training. We hope this study can enhance troop training achievements and improve training efficiency.

Keywords: E-learning, centralization organization, training efficiency, web-base, education training

Title of the Paper: Multi-Dimensional Evaluation Model of Quality of Life in Campus


Authors: Daniel S. Rodrigues, Rui A. R. Ramos, Jose F. G. Mendes

Abstract: The quality of teaching and research activities in universities is somehow related to the quality of the spaces where they take place, whether considering the buildings’ facilities or the campus landscape. Some authors have concluded that students' perception of their overall academic experience and of the campus environment is related to academic accomplishment and success. Furthermore, maintaining and increasing the quality of life in public spaces is also recognised as a critical aspect of the urban sustainable development perspective. In that context, when analysing the characteristics, form, dimension and organisation of university campuses, it can be concluded that they can be seen as urban spaces; this is often reinforced by their location in urban areas, or even merged into the city. Accordingly, based on concepts for urban spaces, a multi-dimensional evaluation model of the quality of life in campus is presented in this paper. Its main purpose is to provide the conceptual basis for the implementation of a decision support system that evaluates a global index of the Quality of Life in the Campus (QlC). The process integrates users’ perception and provides the ability to assess the impact of future interventions on campus quality of life using scenarios. Those scenarios express the changes in QlC through updated indicator values and a recomputed global index. The assessment of the QlC variation that would result from executing a scenario serves as a decision support tool for campus management when studying several possibilities. The case study presented explores the web-based information system for monitoring the QlC of the Gualtar Campus of the University of Minho, located in Braga, Portugal. Basically, the model aims at determining the QlC variation by comparing different moments in time. The system embodies two main functions related to the campus's sustainable development process: (i) to inform the community, allowing any user to know which indicators are considered, their current values, and how the QlC has evolved; and (ii) to serve as a decision support tool, mainly in facilities planning and management, allowing the impact of several scenarios on several quality-of-life dimensions to be compared through an evaluation that integrates the users’ perception.

Keywords: Quality of Life, Decision Support System, University Campus, Public Participation

Title of the Paper: An Application of Fuzzy-Evolutive Integral to Improve the Performances of Multispectral ATR Systems


Authors: Constantin-Iulian Vizitiu, Florin Serban, Teofil Oroian, Cristian Molder, Mihai Stanciu

Abstract: According to the literature on automatic target recognition (ATR) system theory, one of the most important ways to improve the performance of such systems is to use multispectral information provided by a suitable set of sensors together with proper information fusion techniques. In this paper, the authors propose an application using an improved evolutive version of Sugeno’s fuzzy integral to increase target recognition performance based on high-resolution radar (HRR) and video imagery. In order to confirm the theoretical aspects discussed, a real input database was used.

Keywords: ATR systems, HRR/video imagery, decision fusion, neural networks

Title of the Paper: The 19th Century Official Paris Salon Exhibition Digital Museum


Authors: Tingsheng Weng

Abstract: This paper selects the paintings of the 19th Century Official Paris Salon Exhibition collection, housed in the Chi Mei Museum of Taiwan, as the subjects for photographing, scanning, text description, database design, and software programming in order to establish a digital archive of these precious paintings. This research applies the latest information technology to integrate digital archives, systematically divides their contents into eight major sections, and uploads them to the Internet to complete the entire project. Regarding education in particular, on-line learning, tests on paintings, and functional painting mechanisms are provided for teachers, students, and the public, who can choose courses according to their personal interests and experience the global reach of on-line knowledge. By integrating digital archives, a digital museum, and digital learning, this research hopes to utilize resources effectively, enhance educational promotion, further the cultural creativity industry, realize the goal of truly international art, and offer some of Taiwan’s cultural resources to the art lovers of the global village. The digitalization of this art collection is beneficial to the Chi Mei Museum in terms of exhibition, publication, collection, teaching, and research; moreover, the benefits reach far beyond national borders, as people around the world are able to appreciate such cultural assets. In the future, image authorization services may be provided to the digital content industry, as raw digital materials can enhance the concept of added value, inspire innovations for the promotion of education, and facilitate the development of a knowledge economy.

Keywords: Salon, e-books, e-Learning, On-line painting games, Testing, Digital Archives, Digital Museum

Title of the Paper: A Kind of Generalized Fuzzy C-Means Clustering Model and its Applications in Mining Steel Strip Flatness Signal


Authors: Tang Cheng-Long, Wang Shi-Gang, Xu Wei

Abstract: In this paper, intelligent techniques are utilized to enhance quality-control precision in steel strip cold rolling production. Firstly, a new control scheme is proposed; the basis of the scheme is establishing a classifier for the steel strip flatness signal. A fuzzy clustering method is used to build this classifier, for which obtaining high-quality clustering prototypes is one of the key tasks. Secondly, a new fuzzy clustering model, the generalized fuzzy C-means clustering (GeFCM) model, is proposed and used as the mining tool in the real applications. Comparisons with the results obtained by the basic fuzzy clustering model show that GeFCM is robust and efficient: it not only obtains much better clustering prototypes, which are used as the classifier, but also easily and effectively mines outliers. It is very helpful in the steel strip flatness quality control system of a real cold rolling line. Finally, it is pointed out that the new model’s efficiency is mainly due to the introduction of a set of adaptive degrees wj (j = 1…n, where n is the number of data objects) and an adaptive exponent p, which jointly affect the clustering operations. In essence, the proposed GeFCM model is a generalized version of the existing fuzzy clustering models.

Keywords: Steel strip flatness signal; Generalized fuzzy clustering; Outliers mining; Adaptive degrees; Adaptive exponent
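The GeFCM model itself is not specified in the abstract; as a baseline, the standard fuzzy C-means iteration that it generalizes (alternating membership and prototype updates) can be sketched as follows, with invented 1-D toy data:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Standard FCM: alternate membership and prototype updates, minimizing
    sum_ij u_ij^m * ||x_i - v_j||^2 subject to sum_j u_ij = 1."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)       # normalize initial memberships
    for _ in range(iters):
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]              # prototypes
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        U = 1.0 / (D ** (2 / (m - 1)))                      # memberships
        U /= U.sum(axis=1, keepdims=True)
    return U, V

# Two well-separated 1-D clusters around 0 and 10.
X = np.array([[0.0], [0.2], [0.1], [10.0], [10.2], [9.9]])
U, V = fuzzy_c_means(X)
```

GeFCM, per the abstract, additionally weights each object by an adaptive degree wj with an adaptive exponent p, which is what lets it down-weight outliers; that extension is not shown here.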

Title of the Paper: Using Formal Concept Analysis to Leverage Ontology-Based Yoga Knowledge System


Authors: Y. C. Lin

Abstract: Over the past decade, the most significant evolution in organizations might be the dawn of the new economy connected with the value of intellectual assets. In this paper, we combine Formal Concept Analysis (FCA) with Protege to construct an ontology-based Yoga method that serves both as an example and as a knowledge-sharing platform. This can build a picture of experts’ knowledge that the general public can understand, allowing them to participate in it and be fully aware of it, thereby satisfying their requirements. Moreover, the platform will help eliminate the information disparity between patients and physicians.

Keywords: Formal Concept Analysis, Yoga, Information Retrieval, Ontology, Ontology-based system
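As a small illustration of what Formal Concept Analysis computes (the paper's Yoga ontology is not reproduced; the context below is an invented toy), all formal concepts of a binary object-attribute context can be enumerated by closing every attribute subset:

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate all formal concepts (extent, intent) of a binary context
    given as {object: set_of_attributes}, by closing attribute subsets."""
    attrs = sorted({a for s in context.values() for a in s})
    def extent(intent):
        # objects having every attribute in the intent
        return {o for o, s in context.items() if intent <= s}
    def intent(objs):
        # attributes shared by every object in the extent
        sets = [context[o] for o in objs]
        return set.intersection(*sets) if sets else set(attrs)
    concepts = set()
    for r in range(len(attrs) + 1):
        for subset in combinations(attrs, r):
            e = extent(set(subset))
            concepts.add((frozenset(e), frozenset(intent(e))))
    return concepts

# Toy context: yoga poses and hypothetical properties.
ctx = {"tree":    {"standing", "balance"},
       "warrior": {"standing"},
       "corpse":  {"lying"}}
concepts = formal_concepts(ctx)
```

The resulting concept lattice is what tools built on FCA organize into an ontology's class hierarchy; this brute-force closure is exponential in the attribute count and only suitable for small contexts.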

Title of the Paper: Development of an Ontology-based Flexible Clinical Pathway System


Authors: Y. C. Lin

Abstract: The efficient storage of medical knowledge is critical for the advancement of medicine; a flexible platform for the storage of knowledge is the need of the hour. Therefore, this work focuses on clinical pathways—tools that effectively maintain the quality and control the cost of medicine—to formulate a model for knowledge storage. In this work, ontology was employed as the fundamental theory to facilitate the construction of a flexible system. Using task and domain ontologies, clinical pathway knowledge becomes more accessible, as domain and operation knowledge are available separately. Moreover, taking into account the continued refinement of clinical pathways, this work developed cost-effective and quality health care systems, using quartile and variance computations to identify problematic treatments and pathways and to refine decision support systems. The integration of an ontological approach with quartile and variance algorithms for clinical pathways was implemented by system development, the outcome of which is presented in this paper.

Keywords: Ontology, Clinical pathways, Domain ontology, Task ontology, Ontology-based system



Copyright © WSEAS