Leeds Beckett University - City Campus,
Woodhouse Lane,
LS1 3HE
Professor Ah-Lian Kor
Professor
Senior Lecturer in the School of Computing, Creative Technologies and Engineering and Course Leader for MSc Green Computing.
About
Dr Ah-Lian Kor is part of the Leeds Beckett MSc Sustainable Computing Curriculum Development Team. She has been involved in several EU projects on Green Computing, an Innovative Training Model for Social Enterprises Professional Qualifications, and an Integrated System for Learning and Education Services. She has published work on ontology, the Semantic Web, Web services, portals, and semantics for GIS. She is active in AI research and has developed an intelligent map understanding system and a reasoning system.
She is an Editorial Board Member of the International Journal of Web Portals; a member of IQN (the international quality network for spatial cognition); sits on many international conference programme/technical committees (e.g. IEEE Cloud Computing, the Cyberlaws Conference, ICT-EurAsia); is an active paper reviewer for journals and conferences (e.g. AMCIS, the International Journal of Emergency Services, Inderscience journals); serves on the Editorial Advisory Boards of the International Journal on Advances in Intelligent Systems and the International Journal on Advances in Security; and is an associate member of the EPSRC-funded e-GISE (e-Government and System Evaluation) Network, for which she helped organise an international workshop (eGOV05).
Academic positions
Reader in Sustainable and Intelligent Computing
Leeds Beckett University, School of Computing, Creative Technologies and Engineering, Leeds, United Kingdom | 01 September 2018 - present
Degrees
PhD
University of Leeds, Leeds, United Kingdom | October 1996 - January 2001
Related links
Research interests
Dr Kor has designed, implemented and evaluated interactive simulations, and investigated the reasoning and learning styles adopted by users when they interact with computer learning systems. She has modelled qualitative understanding based on semantic networks and causal maps, and conceptual change based on analysis of the search space (the General Problem Solver model). She has used Prolog to develop a prototype which applies qualitative spatial reasoning techniques to identify road junctions (T, Y or V, and X) in a digitised vector map, and has developed a knowledge base with reasoning capability for cardinal directions.
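The junction-identification idea can be illustrated with a small sketch, written in Python rather than Prolog and using hypothetical angle thresholds (the original prototype's rules are not reproduced here): a junction is classified by the number of its arms and whether any pair of arms is roughly collinear.

```python
def classify_junction(bearings):
    """Classify a road junction from the compass bearings (degrees) of its
    arms as seen from the junction point. A simplified stand-in for the
    qualitative spatial reasoning rules described above."""
    arms = len(bearings)
    if arms == 4:
        return "X"          # four arms meeting at a point
    if arms == 3:
        # angles between consecutive arms, sorted around the junction
        b = sorted(bearings)
        gaps = [(b[(i + 1) % 3] - b[i]) % 360 for i in range(3)]
        # a T-junction has one (roughly) straight-through pair of arms,
        # i.e. a gap close to 180 degrees; otherwise treat it as Y/V
        if any(abs(g - 180) < 15 for g in gaps):
            return "T"
        return "Y/V"
    return "unknown"

print(classify_junction([0, 90, 180]))       # T
print(classify_junction([0, 120, 240]))      # Y/V
print(classify_junction([0, 90, 180, 270]))  # X
```

The 15-degree collinearity tolerance is an illustrative choice; a real vector-map prototype would also need to group digitised line segments into arms before classifying.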
Dr Kor conducted statistical tests (e.g. Chi-square, correlation, ANOVA) on survey data collected through CHALCS, a community project in Leeds. For the MOSE project, quantitative analyses (e.g. multiple regression, correlation, ANOVA, and t-tests) were conducted on data collected through online questionnaires. She has guided students in the development of expert systems for a local power station and an intelligent control and monitoring system for a data centre.
Theoretical Research
Dr Kor co-developed the Horizontal and Vertical Constraints Model and an Expressive Hybrid Model for reasoning about cardinal direction relations between regions. This entailed formalising definitions for atomic binary cardinal relations, whole and part cardinal relations, and weak and expressive relations. Dr Kor introduced a formula which can compute the composition of cardinal direction relations for whole or part regions. Composition tables are used as part of the inference engine in the reasoning systems.
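To illustrate how a composition table drives such an inference engine, here is a minimal Python sketch using a simplified point-based calculus; the table entries and relation names are illustrative only, not the published whole/part region relations.

```python
# Hypothetical composition table for a point-based cardinal direction
# calculus: given A r1 B and B r2 C, look up the possible relations A r C.
COMPOSITION = {
    ("N", "N"): {"N"},
    ("N", "E"): {"NE"},
    ("N", "W"): {"NW"},
    ("E", "N"): {"NE"},
    ("E", "E"): {"E"},
    ("N", "S"): {"N", "S", "EQ"},  # composing opposites is ambiguous
}

def infer(r_ab, r_bc):
    """Return the set of possible relations between A and C, given
    A r_ab B and B r_bc C. Unlisted pairs are fully unconstrained."""
    return COMPOSITION.get((r_ab, r_bc), {"any"})

# A north of B, B east of C  =>  A is north-east of C
print(infer("N", "E"))  # {'NE'}
```

The inference engine simply chains such lookups, intersecting candidate sets along each path; the region-based models in the publications above enrich the table with whole/part and weak relations.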
Dr Kor applied the information systems development life cycle, the knowledge management life cycle, organisational learning, and mapping techniques (e.g. conversation, UML state, and causal mapping) to develop an integrated framework (CARE: Continuous Adaptable Reflective Evaluation) for evaluating e-government systems, comprising the Systems Evaluation Life Cycle (SELC), the Reflective Evaluative Methodology (REM), and an integrated methodology drawing on SSM, HSM, and research methodologies.
Based on the action research cycle and innovation theories, a conceptual action-oriented Innovation Model was built. It is intended as a guide both for implementing an innovation process that could bring about transformation and for evaluating an organisation's innovation activeness. A novel PPT (People, Process, Technology) Innovation Value Model was built to guide the evaluation of innovation value in the public sector. Dr Kor also developed a calculus for the conversion of information to knowledge (and vice versa), and constructed a novel Information and Knowledge Management System which aims to unify the various epistemologies and contemporary models of knowledge.
Publications (145)
A Survey of Epistemology and its Implications on an Organisational Information and Knowledge Management Model
This is a theoretical chapter which aims to integrate various epistemologies from the philosophical, knowledge management, cognitive science, and educational perspectives. From a survey of knowledge-related literature, this chapter collates diverse views of knowledge. This is followed by categorising as well as ascribing attributes (effability, codifiability, perceptual/conceptual, social/personal) to the different types of knowledge. The authors develop a novel Organisational Information and Knowledge Management Model which seeks to clarify the distinctions between information and knowledge by introducing novel information and knowledge conversions (information-nothing, information-information, information-knowledge, knowledge-information, knowledge-knowledge) and providing mechanisms for individual knowledge creation and information sharing (between individual-individual, individual-group, group-group) as well as Communities of Practice within an organisation.
Emotion recognition in conversational contexts is vital to affective computing, with applications in human-computer interaction (HCI), mental health diagnostics, and social robotics. However, existing methods struggle to disentangle latent emotional factors, adapt to dynamic causal dependencies, and coherently integrate multimodal data. This paper proposes a unified framework that addresses these limitations through novel causal modeling and multimodal data fusion techniques. We introduce the Multimodal Emotion-aware Causal Model (MECM), which employs variational autoencoders (VAEs) to disentangle discrete emotion categories from continuous intensity levels, and integrates a temporal causal gating mechanism to dynamically modulate historical influences. Building on MECM, the Hierarchical Emotion-Causal Graph Model (HE-CGM) captures multi-scale causal interactions using structured causal matrices for global dependencies and recurrent modules for local transitions. To resolve multimodal inconsistencies, we design a cross-modal fusion strategy that combines attention mechanisms with Kullback-Leibler (KL) divergence regularization, enabling coherent integration of text, audio, and visual modalities. Empirical results demonstrate that HE-CGM outperforms state-of-the-art models in explicit and implicit emotion cause extraction, achieving F1-scores of 79.83 and 96.31 respectively. Our work underscores the value of causal reasoning in multimodal learning and highlights the importance of datasets that capture intra-cultural linguistic diversity in Chinese contexts for developing effective AI applications.
Semantic Web, RDF, and Portals
In the existing literature, Semantic Web portals (SWPs) are sometimes known as semantic portals or semantically enhanced portals. The SWP is the next-generation Web portal, publishing content and information readable by both machines and humans. A SWP has all the generic functionalities of a Web portal but is developed using Semantic Web technologies. In addition, it has several enhanced capabilities, such as semantics-based search, browsing, navigation, process automation, and extraction and integration of information (Lausen, Stollberg, Hernandez, Ding, Han & Fensel, 2004; Perry & Stiles, 2004). To date, the only available resources on SWPs are isolated published Web resources and research or working papers. There is a need to pool these resources together in a coherent way so as to give readers a comprehensive idea of what SWPs are and how they can be built, supported by appropriate examples. Additionally, this article provides useful Web links for more extensive as well as intensive reading on the subject.
Ontology, Web Services, and Semantic Web Portals
In the article, entitled “Semantic Web, RDF, and Portals”, it is mentioned that a Semantic Web Portal (SWP) has the generic features of a Web portal but is built on semantic Web technologies. This article provides an introduction to two types of Web ontology languages (RDF Schema and OWL), semantic query, Web services, and the architecture of a Semantic Web Portal.
Support For Students With Learning Disabilities In Higher Education
Counter-examples used in a Socratic dialogue aim to provoke reflection to effect conceptual changes. However, natural language forms of Socratic dialogues have their limitations. To address this problem, we propose an alternative form of Socratic dialogue called the pictorial Socratic dialogue. A Spring Balance System has been designed to provide a platform for investigating the effects of this pedagogy on conceptual change. The system allows learners to run and observe an experiment, and qualitative Cartesian graphs are employed for learners to represent their solutions. Indirect and intelligent feedback is prescribed through two approaches in the pictorial Socratic dialogue, which aim to provoke learners to probe, through the perceptual structural features of the problem and solution, into the deeper level of the simulation where Archimedes’ Principle governs.
A Summary on Skills Analysis and Skills-related Action Plans in the UK
New media Strategies of Higher Education Institutions and their Implementation – National and International Examples: Country Report of the United Kingdom
This is an eGISE network article which aims to justify the need for a holistic approach (the CARE framework) to the evaluation of e-government projects and to outline a programme of research for its delivery. It is argued that existing methods of evaluation are too limited in terms of scope, perspective, and application, and do not offer the necessary potential for learning in an environment characterised by enormous change and considerable investment. Developing the CARE framework addresses these limitations by providing a method of collaborative inquiry which involves relevant stakeholders in the appraisal of both ‘hard’ and ‘soft’ aspects of an e-government system. It is also intended that the research project produces supporting software tools for the framework. The proposed project is based on previous work in the construction industry that developed a cross-organisational learning approach (COLA). Developing a similar strategy for knowledge management is likely to be effective because the ‘silo’ culture of local government organisations has parallels with the segmented organisational structures within the construction industry.
Our previous work focuses on how the nine tiles in the 2-D projection-based model for cardinal directions can be partitioned into sets based on horizontal and vertical constraints (the Horizontal and Vertical Constraints Model). In this paper, the 2-D Horizontal and Vertical Constraints Model is adapted and extended into a 3-D Horizontal and Vertical Constraints Block Model so that it facilitates easy reasoning with 3-D volumetric regions (i.e. single-pieced and without holes) in the real physical world (e.g. intelligent robotics, building construction). This model partitions the 3-D Euclidean space of a 3-D reference region into nine blocks, namely left, middlex, right, above, middley, below, left, middlez, right. The additional central block (the Minimum Bounding Box of the 3-D reference region) is the intersection of the three blocks middlex, middley, and middlez. The added value of the 3-D Horizontal and Vertical Constraints Block Model is its use of an intuitive (i.e. commonsense) knowledge representation for 3-D orientation relations. The underlying formal representation of the model is facilitated through the use of the 3-D Cartesian coordinate system, first-order logic, and Boolean algebraic expressions. The novel contribution of this research is to foster reasoning with partial orientation-relation knowledge (so-called weak relations) and to integrate mereology into the 3-D model in order to render its representation more expressive. Composition of relations is the technique employed in this research to generate new knowledge. Finally, several examples demonstrate how the model can be used to make inferences about 3-D orientation relations.
Audit of an Organisation’s ICT Systems for Flexible Working
This research entails an audit of the ICT systems within an organisation to determine the environmental impact of flexible working on the organisation’s carbon footprint. The study reviews current issues and methodologies in the green ICT sector before providing an overview of the research process. Questionnaires and observations are employed to investigate employee working habits. A number of energy consumption measuring tools, such as Joulemeter, Powermeter, and SusteIT, are used to audit the energy consumption of the laptops, monitors and phones used by the organisation. The research reveals that working from home has a lower carbon footprint than working in the office, primarily because it avoids commuting-related energy consumption. Approximately 20% of the organisation’s staff work from home. The organisation’s annual carbon footprint is 31,509 kg of CO2 emissions, taking into consideration IT equipment and travel-related emissions. The recommendation is to allow more staff to work from home, with guidelines on the responsible handling of IT equipment in order to reduce its energy consumption. It is recommended that further study be undertaken to produce a detailed carbon footprint report.
Continuous improvement in the energy efficiency of existing data centers would help reduce their environmental footprints. Greening of data centers can be attained using renewable energy sources, more energy-efficient compute systems, and effective cooling systems. A reliable cooling system is necessary to generate a persistent flow of cold air to cool servers that are subjected to increasing computational load demand. The heat dissipated by servers places a strain on the cooling systems and, consequently, on electricity consumption. Heat generated in the data center is categorized into different granularity levels, namely server level, rack level, room level, and data center level. Several datasets were collected at the ENEA Portici Data Center from the CRESCO6 cluster, a High-Performance Computing cluster. The cooling and environmental aspects of the data center are also considered for data analysis. This research conducts a rigorous exploratory data analysis on each dataset, separately and collectively, in various stages. The work presents descriptive and inferential analyses for the feature selection and extraction process. Furthermore, supervised machine learning modelling and correlation estimation are performed on all the datasets to extract relevant features that would have an impact on energy efficiency in data centers.
Conceptual design represents a critical initial design stage that involves both technical and creative thinking to develop and derive concept solutions that meet design requirements. TRIZ Scientific Effects (TRIZSE) is one of the TRIZ tools; it uses a database of the functions, transformations, and parameterisations of scientific effects to provide conceptual solutions to engineering and design problems. Although TRIZSE has been introduced to help engineers solve design problems in the conceptual design phase, the current TRIZSE database presents general scientific concept solutions with only a few, very abstract, examples of solutions from patents, and has not been updated since its introduction. This research explores the derivation of a novel framework that links TRIZ scientific effects to current patent information (USPTO) using data mining techniques, in order to develop a better design support tool that assists engineers in deriving innovative design concept solutions. The novel framework provides better, updated, relevant and specific examples of conceptual design ideas from patents. The research used Python as the base programming platform to develop a conceptual design software prototype based on this new framework, in which both the TRIZSE database and the patents database (USPTO) are searched and processed in order to build a Doc2Vec similarity model. A case study on the corrosion of copper pipelines by seawater is presented to validate the framework, and results from the novel TRIZSE database and patent examples are presented and discussed in this paper. The results of the case study indicate that the Doc2Vec model is able to perform its intended similarity queries. The patent examples from the case study results warrant further consideration in conceptual design activities.
Due to the alarming rate of climate change, fuel consumption and emission estimates are critical in determining the effects of materials and stringent emission control strategies. In this research, an analytical and predictive study has been conducted using a Government of Canada dataset containing 4,973 light-duty vehicles observed from 2017 to 2021, delivering a comparative view of different brands and vehicle models by their fuel consumption and carbon dioxide emissions. Based on the findings of the statistical data analysis, this study makes evidence-based recommendations to both vehicle users and producers to reduce their environmental impacts. Additionally, Convolutional Neural Networks (CNNs) and various regression models have been built to estimate fuel consumption and carbon dioxide emissions for future vehicle designs. The study reveals that the univariate polynomial regression model is the best model for predictions from a single vehicle feature input, with up to 98.6% accuracy. Multiple linear regression and multivariate polynomial regression are good models for predictions from multiple vehicle feature inputs, with approximately 75% accuracy. The Convolutional Neural Network is also a promising method for prediction because of its stable and high accuracy of around 70%. The results contribute to quantifying the energy cost and air pollution caused by transportation, and lead to relevant recommendations for both vehicle users and producers. Future research should aim to develop higher-performance models and larger datasets for building APIs and applications.
Sustainability issues have become more serious due to the rapid development of the global economy. Sustainable design is an approach to designing or creating sustainable products and solutions based on sustainable development principles. Patent documents contain much useful inventive information for sustainable product design. However, they are dense and lengthy, overladen with technical terminology. Automatic text mining tools for patent analysis are therefore in great demand to assist innovators and patent engineers in their patent searches. The main focus of this work is to develop a patent mining prototype that extracts sustainable design information from the patent database and recommends potential solutions to the user by means of patent mining and TRIZ. The TRIZ problem-solving process and the details of the patent mining are described in this paper. The patent mining techniques include tokenisation, stop-word filtering, stemming, lemmatisation and classification. These techniques were implemented together with relevant sustainable design indicators to identify the patent documents containing the most relevant sustainable design solutions or suggestions. A sustainable design problem is used to demonstrate how a TRIZ user can apply the implemented patent mining techniques and sustainable design indicators to obtain a sustainable solution to a design problem.
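A minimal pure-Python sketch of the first preprocessing steps named above (tokenisation, stop-word filtering, and stemming) is shown below; the stop-word list is a toy one, and the crude suffix-stripping stemmer stands in for a real stemmer such as Porter's.

```python
import re

STOP_WORDS = {"a", "an", "the", "of", "and", "to", "in", "is", "for"}

def tokenize(text):
    """Lowercase word tokenization via a simple regex."""
    return re.findall(r"[a-z]+", text.lower())

def filter_stop_words(tokens):
    """Drop high-frequency function words that carry no design content."""
    return [t for t in tokens if t not in STOP_WORDS]

def stem(token):
    """Crude suffix stripping; a real pipeline would use Porter stemming."""
    for suffix in ("ing", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

doc = "Recycling of the coating materials reduces sustainable design costs"
tokens = [stem(t) for t in filter_stop_words(tokenize(doc))]
print(tokens)
# ['recycl', 'coat', 'material', 'reduc', 'sustainable', 'design', 'cost']
```

The stemmed tokens would then be matched against sustainable design indicators and fed to a classifier, which the sketch does not attempt.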
Evaluation of the Performance of Machine Learning and Deep Learning Techniques for Predicting Rainfall: An Illustrative Case Study from Australia
Rainfall is a major factor in our ecological and environmental balance for a variety of reasons, including the economy, agriculture, and sanitation. It supplies the planet with essential fresh water, especially in areas where groundwater resources are scarce. Hence, a dependable prediction model for rainfall is essential, as it can help predict flooding and monitor pollutant levels. Historically, weather predictions were made using meteorological satellites; now, with advancements in technology and data analysis, machine learning is used in weather forecasting. However, accurately predicting rainfall remains a complex task, and existing methods depend on complex models that may incur high costs due to their extensive computational requirements. This research assesses the effectiveness of both conventional machine learning algorithms and deep learning techniques as potential options, by conducting a comprehensive comparison using a uniform case study that analyzes ten years of rainfall data collected from various regions of Australia. Through the comparisons and evaluations, we aim to find the most feasible method for the detection of weather patterns. The models' performance is measured using metrics such as loss, Mean Absolute Error, Mean Squared Error and Mean Squared Logarithmic Error. The results show that the proposed CNN model is the most accurate among all the models.
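The error metrics named above are straightforward to compute; here is a small Python sketch with illustrative rainfall values (not data from the study).

```python
import math

def mae(y, p):
    """Mean Absolute Error."""
    return sum(abs(a - b) for a, b in zip(y, p)) / len(y)

def mse(y, p):
    """Mean Squared Error."""
    return sum((a - b) ** 2 for a, b in zip(y, p)) / len(y)

def msle(y, p):
    """Mean Squared Logarithmic Error; log1p tolerates zero rainfall."""
    return sum((math.log1p(a) - math.log1p(b)) ** 2 for a, b in zip(y, p)) / len(y)

actual    = [0.0, 2.0, 4.0]   # mm of rainfall, illustrative
predicted = [1.0, 2.0, 5.0]
print(round(mae(actual, predicted), 4))   # 0.6667
print(round(mse(actual, predicted), 4))   # 0.6667
print(round(msle(actual, predicted), 4))
```

MSLE penalises relative rather than absolute error, which is why it is a common choice for skewed quantities such as rainfall.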
It is widely accepted that human activities largely contribute to global emissions and thus greatly impact climate change. Promoting awareness and adoption of green transportation modes could make a difference in the long term. To achieve behavioural change, we investigate the use of a persuasive game that uses online transportation mode recognition to award bonuses and penalties to users based on their daily choices of transportation mode. To facilitate easy identification of the transportation mode, classification predictive models are built from accelerometer and gyroscope historical data. Preliminary results show that the classification true-positive rate for recognising 10 different transportation classes can reach up to 95% when using a historical set (66% without). Results also reveal that the random tree classification model is a viable choice compared to random forest in terms of sustainability. Qualitative studies of the trained classifiers and measurements of Android-device gravity also raise several issues that could be addressed in future work. This research could be enhanced through acceleration normalisation to address device and user ambiguity.
An algorithmic approach to the assessment of survivability is proposed, based on Lanchester’s modified deterministic model. Methods are suggested for increasing the available time capability for nuclear power plant monitoring and coverage using the required, or a limited, number of operable drones. Dependencies of the variance between the residual fleet damage and the permissible drone fleet damage on monitoring time, as well as dependencies of the monitoring time on the recovery group productivity, are analysed.
Immersive Virtual Reality (VR) technology is increasingly used in education. However, research on its impact and development in education, specifically from the perspective of telepresence, emotion, and cognition, remains limited. Therefore, this study aims to reveal the effects of these three dimensions and their interactions in education by addressing six proposed research questions. Specifically, it explores the impact of VR technology on learning experiences and its future development through experiments conducted across seven subjects: History, Bioengineering, English, Chinese Language and Literature, Music, Art Design, and Physical Education. Our findings reveal that emotional engagement plays a pivotal role in driving cognitive engagement within VR environments, challenging traditional perspectives on negative emotions by demonstrating their potential as motivators. We recommend enhancing emotional design elements by strategically invoking certain negative emotions to enrich the emotional challenge in the VR environment. Furthermore, the study underscores the importance of leveraging visual preferences, integrating suitable auditory stimuli, and utilizing telepresence characteristics to develop immersive VR environments that foster a sense of agency, autonomy, and critical thinking among learners. This research provides critical insights to guide the future development of VR technology in education, contributing to global learning, addressing widespread educational inequalities, and making a positive impact on society.
Generative Adversarial Networks in Combustion: Flame Image Generation for Clean and Predictive Combustion Modeling using Real Data
This study explores the application of Generative Adversarial Networks (GANs) in combustion science, utilizing a flame image dataset. By comparing the produced images with the original dataset, we qualitatively analyze the discriminator and generator loss to assess the performance of the GAN. The results show improvements in the discriminator’s ability to distinguish between real and generated images, as well as improvements in the generator’s ability to add missing details, leading to the generation of images that are more realistic. Fundamental properties of flames are well captured in the resulting images, despite the absence or distortion of minor details. The study advances the fields of artificial intelligence, image processing, and combustion science by highlighting possible uses in the creation of synthetic images and data augmentation. Overall, our qualitative analysis enriches comprehension of combustion science.
The domain of multi-network latency prediction for IoT and Wireless Sensor Networks (WSNs) confronts significant challenges. However, continuous research efforts and progress in areas such as machine learning, edge computing, security technologies, and hybrid modelling are actively closing the identified gaps. Effectively addressing the inherent complexities of this field will play a crucial role in unlocking the full potential of latency prediction systems within the dynamic and diverse landscape of the Internet of Things (IoT). Using linear interpolation and extrapolation algorithms, the study explores the use of multi-network real-time end-to-end latency data for precise prediction. This approach has significantly improved network performance through throughput and response time optimization. The findings indicate high prediction accuracy, with the majority of experimental connection pairs achieving over 95% accuracy and the remainder falling within a 70% to 95% accuracy range. This research provides tangible evidence that data packet and end-to-end latency time predictions for heterogeneous low-rate and low-power WSNs, facilitated by a localized database, can substantially enhance network performance and minimize latency. Our proposed JosNet model simplifies and streamlines WSN prediction by employing linear interpolation and extrapolation techniques. The research findings also underscore the potential of this approach to revolutionize the management and control of data packets in WSNs, paving the way for more efficient and responsive wireless sensor networks.
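A minimal sketch of latency prediction by linear interpolation and extrapolation over timestamped samples follows; the function and data are illustrative stand-ins, not the JosNet implementation, whose details are not reproduced here.

```python
def predict_latency(samples, t):
    """Predict end-to-end latency at time t by linear interpolation between
    the two bracketing timestamped samples, or linear extrapolation from the
    nearest two samples when t lies outside the observed window.
    `samples` is a time-sorted list of (timestamp, latency_ms) pairs."""
    if t <= samples[0][0]:
        (t0, y0), (t1, y1) = samples[0], samples[1]
    elif t >= samples[-1][0]:
        (t0, y0), (t1, y1) = samples[-2], samples[-1]
    else:
        for (t0, y0), (t1, y1) in zip(samples, samples[1:]):
            if t0 <= t <= t1:
                break
    # the same line equation serves interpolation and extrapolation
    return y0 + (y1 - y0) * (t - t0) / (t1 - t0)

history = [(0, 10.0), (10, 14.0), (20, 12.0)]
print(predict_latency(history, 5))   # 12.0 (interpolated)
print(predict_latency(history, 30))  # 10.0 (extrapolated)
```

In a WSN deployment, such a predictor would be refreshed from the localized latency database as new end-to-end measurements arrive.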
This book chapter is adapted from [1] and is closely linked to work published in [2] and [3]. Reducing the power consumption of network equipment has been driven both by the need to reduce the ecological footprint of the cloud and by the immense power costs of data centers. As data centers, core networks and, consequently, the cloud constantly increase in size, their power consumption should be mitigated. Ethernet, the most widely used access network, still remains the biggest communication technology used in core networks and cloud infrastructures. The Energy-Efficient Ethernet (EEE) standard, introduced by IEEE in 2010, aims to reduce the power consumption of EEE ports by transitioning Ethernet ports into a low-power mode when traffic is not present. As statistics show that the average utilization rate of Ethernet links is 5 percent on desktops and 30 percent in data centers, the power saving potential of EEE could be immense. This research aims to assess the benefits of deploying EEE and create a power consumption model for network switches with and without EEE. Our measurements show that an EEE port runs at 12-15% of its total power when in low-power mode. Therefore, the power savings can exceed 80% when there is no traffic. However, our measurements equally show that the power consumption of a single port represents less than 1% of the total power consumption of the switch. The base power consumed by the switch without any port is still significantly high and is not affected by EEE. Experimental results also show that the base power consumption of switches does not increase significantly with the size of the switch: doubling the size of the switch from 24 to 48 ports increases power consumption by 35.39%. EEE has a greater effect on bigger switches, with a power (or energy) gain on the EEE-enabled 48-port switch compared to 2 x EEE-enabled 24-port switches.
On the other hand, it seems to be more energy efficient to use 2 separate 24-port switches (no EEE) than 2 separate 24-port switches (with EEE).
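The figures reported above suggest a simple per-switch power model; the sketch below is illustrative only, with hypothetical base and per-port wattages, and uses the reported 12-15% low-power figure (here 13.5%).

```python
def switch_power(ports, active_ports, base_w, port_active_w,
                 eee=True, low_power_frac=0.135):
    """Toy power model for an EEE-capable switch: an idle EEE port draws
    roughly 12-15% (here 13.5%) of its active power, while the base
    (chassis) power is unaffected by EEE. All wattage arguments are
    hypothetical, not measured values from the chapter."""
    idle_ports = ports - active_ports
    idle_port_w = port_active_w * (low_power_frac if eee else 1.0)
    return base_w + active_ports * port_active_w + idle_ports * idle_port_w

# Example: 48-port switch, hypothetical 60 W base and 1 W per active port
print(switch_power(48, 0, 60.0, 1.0, eee=True))   # all idle, EEE on
print(switch_power(48, 0, 60.0, 1.0, eee=False))  # all idle, EEE off
```

Under these assumed numbers the per-port saving when idle is about 86%, consistent with the chapter's ">80% with no traffic" observation, while the dominant base power explains why whole-switch savings are far smaller.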
This work analyzes a very subtle kind of energy metric for Data Centers (DCs), namely productivity metrics, which affect the global energy efficiency assessment of a DC since they focus on the energy used for processing computing operations. By exploiting the available set of energy consumption data of operating systems in the ENEA DC HPC cluster, the authors evaluated the energy consumed by different queues with several running applications. The queues' energy waste has been calculated to provide an assessment of the ineffective use of computation-related energy load within the cluster. This work presents an incremental innovation beyond the state of the art for productivity metrics (e.g. useful work), and it will also help provide invaluable insight into useful energy use and the use of enhanced sustainability metrics, with the goal of driving a more sustainable DC. Additionally, the sustainability concept in DC operations is driven by the estimation of indirect carbon emissions, which is shown in this work.
The study and analysis of energy efficiency in Data Centers (DCs), through a set of globally accepted metrics, is an ongoing challenge. In particular, the area of productivity metrics is not completely explored, and no existing metric provides a direct measurement of the useful work in a DC. This paper proposes a methodology for measuring, calculating, and evaluating the energy productivity of a DC, encompassing both the portion of energy employed for computing and the energy wasted during computational work. It involves the estimation of productive energy consumption by a DC cluster based on the following: statistical data collection and interpretation, software for energy data analysis, and mathematical formulation. This work is based on data extracted through experiments conducted on the “CRESCO4” cluster at the ENEA Data Center facilities. The dataset covers the power and job schedule characteristics of the cluster for one year. This paper shows how to advance beyond the state of the art for productivity metrics (e.g. useful work). It will also help enhance server performance and power management, since appropriate statistical data analysis provides a profile of server energy consumption behavior. Additionally, we make recommendations on how the productivity assessment could drive a new power efficiency management strategy, specifically targeted at DC managers and/or operators, and end-users of the facilities.
Our previous work focuses on how the nine tiles in the 2-D projection-based model for cardinal directions can be partitioned into sets based on horizontal and vertical constraints (called the Horizontal and Vertical Constraints Model). In this paper, the 2-D Horizontal and Vertical Constraints Model is adapted and extended into a 3-D Horizontal and Vertical Constraints Block Model so that it facilitates easy reasoning with 3-D volumetric regions (i.e. single-pieced and without holes) in the real physical world (e.g. intelligent robotics, building construction, etc.). This model partitions the 3-D Euclidean space around a 3-D reference region into nine blocks: left, middlex, and right along one axis; above, middley, and below along a second; and the corresponding triple of blocks, including middlez, along the third. The additional central block (the Minimum Bounding Box of the 3-D reference region) is the intersection of the three blocks middlex, middley, and middlez. The added value of the 3-D Horizontal and Vertical Constraints Block Model is its use of intuitive (i.e. commonsense) knowledge representation for 3-D orientation relations, while the underlying formal representation of the model is facilitated through the 3-D Cartesian coordinate system, first-order logic, and Boolean algebraic expressions. The novel contribution of this research is fostering reasoning with partial orientation-relation knowledge (note: these are called weak relations) and integrating mereology into the 3-D model in order to render its representation more expressive. Finally, composition of relations is the technique employed in this research to generate new knowledge, and several examples demonstrate how the model can be used to make inferences about 3-D orientation relations.
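The block partition can be illustrated with a coordinate test against the reference region's Minimum Bounding Box. This is a hypothetical sketch for point-like objects only (the paper handles extended volumetric regions), and the z-axis block names used here are assumptions for illustration:

```python
def block_of(point, mbb_min, mbb_max):
    """Classify a 3-D point relative to a reference region's Minimum
    Bounding Box (MBB): one block per axis, via coordinate comparison."""
    names = (("left", "middlex", "right"),      # x-axis
             ("below", "middley", "above"),     # y-axis
             ("behind", "middlez", "front"))    # z-axis names assumed
    out = []
    for p, lo, hi, (low, mid, high) in zip(point, mbb_min, mbb_max, names):
        out.append(low if p < lo else high if p > hi else mid)
    return tuple(out)

# A point left of, above, and within the z-extent of a unit-cube reference region:
print(block_of((-1.0, 2.0, 0.5), (0, 0, 0), (1, 1, 1)))  # → ('left', 'above', 'middlez')
```

A point classified as ('middlex', 'middley', 'middlez') lies in the central block, i.e. inside the MBB itself.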
Defending IT systems against intelligent malware
© 2018 IEEE. The increasing number of malware variants seen in the wild is causing problems for antivirus software vendors, who are unable to keep up with creating signatures for each. The methods used to develop a signature, static and dynamic analysis, have various limitations. Machine learning has been used by antivirus vendors to detect malware based on the information gathered from the analysis process. However, adversarial examples can cause machine learning algorithms to misclassify new data. In this paper we describe a method for malware analysis that converts malware binaries to images and then prepares those images for training within a Generative Adversarial Network. These unsupervised deep neural networks are not susceptible to adversarial examples. The conversion of malware binaries to images should be faster than dynamic analysis, and it would still be possible to link malware families together. Using the Generative Adversarial Network, malware detection could be made much more effective and reliable.
Modern smartphones have become indispensable for many people around the world as they continue to evolve and introduce newer functions and operations. Battery capacity has, however, failed to keep up with the rate at which smartphones have evolved in recent years, leading to rapid battery drain and the need for users to discard and replace devices very frequently. This inevitably leads to increased greenhouse gas emissions and harmful consequences the world over due to poor disposal and reuse practices among users. Using the Samsung Galaxy Note as an Android platform for experimentation, the factors most responsible for energy consumption and battery drain in smartphones are identified as the network, the device specifications, the applications on the device, and the common practices of the smartphone user. Interviews conducted with varied respondents further reveal that user practices impact energy consumption in smartphones more significantly than perhaps all the other factors. It is recommended that information be better conveyed to smartphone owners, while smartphone manufacturers should improve their design specifications in keeping with the Green Code. Further study is also suggested to distinctly clarify the impact of the stated factors on smartphone battery drain.
Welcome message from the chairs of IEEE ON-MOVE 2018, introducing the 12th edition of the workshop focused on vehicular and user mobility networks, held in Chicago, USA, and highlighting its role as a global forum for researchers and experts in mobile communications and vehicular networking.
This research aims to assess and evaluate the impact on sustainability in buildings through the implementation of ICT smart systems. The setting for this research is a large global organisation's headquarters in Germany. The objectives are: to audit the ICT infrastructure used and survey the existing smart systems implemented; to investigate the total energy expenditure and carbon footprint for ICT equipment over a yearly period; and to explore how best to transfer green ICT best practices to other buildings. Based on the findings in this paper, investing in energy-saving ICT equipment, or even a BMS, can be very cost-beneficial to a company and, when implemented correctly, reduce the carbon footprint of commercial buildings.
The prevalent demand for remote data sharing and connectivity has catalysed the development of many wireless network technologies. However, low-power and low-rate wireless network technologies have emerged as the preferred choice (due to cheap procurement and maintenance costs, efficiency, and adaptability). Currently, these groups of wireless networks are adopted in the home, health, and business sectors. The growth of existing WSNs has resulted in the incompatibility of wireless network protocols, a problem that leads to high acquisition or maintenance costs, increased complexity, reliability inadequacies in some instances, lack of uniformity within similar standards, and high energy consumption. To address this problem, we develop a novel machine-to-machine software-based brokerage application (known as JosNet) for interoperability and integration between Bluetooth LE, Zigbee, and Thread wireless network technologies. JosNet allows different network protocols to exchange data packets or commands with one another. In this paper, we present a novel working network brokerage model for one-to-one network protocol communication (e.g., from Zigbee to Bluetooth) or one-to-many network protocol communication (e.g., from Bluetooth to Zigbee, Thread, etc.) to securely send messages in a large-scale routing process over short- or long-range connections. We also present a large-scale implementation of JosNet using a routing table for large areas. The results show industry-standard performance for end-to-end latency and throughput.
Nature-Inspired Optimization (NIO) algorithms have become prevalent for addressing a variety of optimization problems in real-world applications because of their simplicity, flexibility, and effectiveness. Application areas of NIO algorithms include telecommunications, image processing, engineering design, and vehicle routing. This study presents a critical analysis of the energy consumption and corresponding carbon footprint of four popular NIO algorithms. Microsoft Joulemeter is employed to measure the energy consumption during the runtime of each algorithm, while the corresponding carbon footprint of each algorithm is calculated based on the UK DEFRA guide. The results of this study show that each algorithm demonstrates different energy consumption behavior to achieve the same goal. In addition, a one-way Analysis of Variance (ANOVA) test is conducted, which shows that the average energy consumption of each algorithm is significantly different from the others. This study will help guide software engineers and practitioners in their selection of an energy-efficient NIO algorithm. As for future work, more NIO algorithms and their variants can be considered for energy consumption analysis to identify the greenest NIO algorithms amongst them all. Future work can also seek to ascertain possible relationships between NIO algorithms and the energy usage of hardware resources on different CPU architectures.
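The one-way ANOVA used in the study can be reproduced with a short standard-library function; the per-run energy readings below are hypothetical stand-ins for Joulemeter measurements:

```python
def anova_f(groups):
    """One-way ANOVA F statistic: between-group variance over
    within-group variance for k groups of measurements."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical energy readings (J) for three NIO algorithms over four runs each:
runs = [[102, 98, 101, 99], [110, 112, 109, 111], [95, 97, 96, 94]]
f_stat = anova_f(runs)  # a large F suggests the mean energies differ
```

In practice the F statistic is compared with the critical value for the chosen significance level (or a library routine such as scipy.stats.f_oneway is used, which also returns the p-value).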
Data Mining for Big Dataset-Related Thermal Analysis of High Performance Computing (HPC) Data Center
Greening of Data Centers could be achieved through energy savings in two significant areas, namely compute systems and cooling systems. A reliable cooling system is necessary to produce a persistent flow of cold air to cool the servers due to increasing computational load demand. Servers' dissipated heat exerts a strain on the cooling systems. Consequently, it is necessary to identify hotspots that frequently occur in the server zones. This is facilitated through the application of data mining techniques to an available big dataset of thermal characteristics of the High-Performance Computing ENEA Data Center cluster Cresco 6. This work presents an algorithm that clusters hotspots with the goal of reducing a data centre's large thermal gradient caused by the uneven distribution of server-dissipated waste heat, thereby increasing cooling effectiveness.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. The Internet of Things (IoT) coupled with big data analytics is emerging as the core of smart and sustainable systems which bolster economic, environmental and social sustainability. Cloud-based data centers provide high-performance computing power to analyze voluminous IoT data and provide invaluable insights to support decision making. However, the multifarious servers in data centers appear to be a black hole of superfluous energy consumption, contributing 23% of the ICT (Information and Communication Technology) industry's global carbon dioxide (CO2) emissions. IoT-related energy research focuses on low-power sensors and enhanced machine-to-machine communication performance, yet cloud-based data centers still face energy-related challenges which are detrimental to the environment. Virtual machine (VM) consolidation is a well-known approach for achieving energy-efficient cloud infrastructures. Although several research works demonstrate positive results for VM consolidation in simulated environments, there is a gap for investigations on real, physical cloud infrastructure for big data workloads. This research work addresses that gap by conducting experiments on a real physical cloud infrastructure, set up primarily to evaluate dynamic VM consolidation approaches which integrate algorithms from existing relevant research. An open-source VM consolidation framework, Openstack NEAT, is adopted and experiments are conducted on a multi-node Openstack cloud with Apache Spark as the big data platform. Open-source Openstack has been deployed because it enables rapid innovation and boosts scalability as well as resource utilization. Additionally, this research work investigates performance based on service level agreement (SLA) metrics and the energy usage of compute hosts.
Relevant results concerning the best performing combination of algorithms are presented and discussed.
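Openstack NEAT's dynamic consolidation combines overload/underload detection with live migration; the core packing idea can be sketched with the classic first-fit-decreasing heuristic. The loads and capacity below are hypothetical CPU percentages, not values from the experiments:

```python
def consolidate(vm_loads, host_capacity):
    """Pack VM CPU loads onto as few hosts as possible using the
    first-fit-decreasing heuristic, so freed hosts can be switched
    to a low-power state. Returns one list of VM loads per host."""
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for h in hosts:
            if sum(h) + load <= host_capacity:
                h.append(load)   # first host with room wins
                break
        else:
            hosts.append([load])  # no host fits: open a new one
    return hosts

# Six VMs (CPU %) that consolidate onto two hosts of capacity 100:
placement = consolidate([50, 20, 30, 40, 10, 50], host_capacity=100)
```

Real consolidation must also respect SLA headroom and migration cost, which is exactly what the SLA metrics in the experiments capture.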
Data centers aim to provision on-demand processing, storage and networking capabilities in a reliable and scalable way. In this context, proper maintenance of IT equipment within DC premises is crucial, as it ensures a prolonged lifetime of servers and uninterrupted availability of resources. DC management teams' sustainable operation effort comprises various approaches to directly and indirectly reduce DC energy consumption. Thermal management aims to reduce the excess energy consumed by air cooling and compute systems. This paper focuses on the analysis of the exact temperatures in a real DC cluster rather than considering device setpoints or guidelines. An extensive statistical analysis of available thermal data collected by server-level sensors is conducted, together with the evaluation of global and local thermal metrics. This enables isolating possible risks engendered by potentially negative covert cooling-related factors. The ultimate outcome of this research is to bring about improvements in DC thermal management for sustainable operations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. The energy efficiency of Data Center (DC) operations heavily relies on the DC ambient temperature as well as its IT and cooling systems' performance. A reliable and efficient cooling system is necessary to produce a persistent flow of cold air to cool servers that are subjected to constantly increasing computational load due to the advent of smart cloud-based applications. Consequently, the increased demand for computing power will inadvertently increase server waste heat creation in data centers. To improve a DC thermal profile, which undeniably influences energy efficiency and the reliability of IT equipment, it is imperative to explore the thermal characteristics of the IT room. This work employs an unsupervised machine learning technique to uncover weaknesses of a DC cooling system based on real DC monitoring thermal data. The findings of the analysis identify areas for thermal management and cooling improvement that further feed into DC recommendations. With the aim of identifying overheated zones in a DC IT room and the corresponding servers, we analyzed the thermal characteristics of the IT room. The experimental dataset includes measurements of ambient air temperature in the hot aisle of the IT room at the ENEA Portici research center hosting the CRESCO6 computing cluster. We use machine learning clustering techniques to identify overheated locations and categorize computing nodes based on surrounding air temperature ranges abstracted from the data. This work employs principles and approaches replicable for the analysis of the thermal characteristics of any DC, thereby fostering transferability.
This paper demonstrates how best practices and guidelines could be applied for thermal analysis and profiling of a commercial DC based on real thermal monitoring data.
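The clustering step can be illustrated with a deterministic one-dimensional k-means over hot-aisle readings. The temperatures below are hypothetical, not CRESCO6 data, and a real analysis would use a library implementation:

```python
def kmeans_1d(values, k, iters=50):
    """Lloyd's algorithm on scalar temperature readings, grouping them
    into k bands (e.g. cool / warm / overheated). Initialization uses
    evenly spaced order statistics, so the result is deterministic."""
    vals = sorted(values)
    centers = [vals[i * (len(vals) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vals:
            groups[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

# Hypothetical hot-aisle temperatures (°C); the top band flags overheated nodes.
temps = [24.1, 24.8, 25.0, 30.2, 30.9, 31.5, 38.0, 38.6, 39.1]
centers = kmeans_1d(temps, k=3)
```

Nodes assigned to the highest-centered band would then be mapped back to their rack positions to locate the overheated zone.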
Greening of Data Centers could be achieved through energy savings in two major areas, namely compute systems and cooling systems. A reliable cooling system is necessary to produce a persistent flow of cold air to cool the servers under an increasingly demanding computational load. Servers' dissipated heat exerts a strain on the cooling systems. Consequently, it is imperative to identify individual servers that frequently occur in the hotspot zones. This is facilitated through the application of data mining techniques to an available big dataset of thermal characteristics of the HPC ENEA Data Center cluster Cresco 6. This work involves the implementation of an advanced algorithm on the workload management platform that produces hotspot maps, with the goal of reducing the data centre-wide thermal gradient and increasing cooling effectiveness.
Big Data applications have become increasingly popular with the emergence of cloud computing and the explosion of artificial intelligence. The increasing adoption of data-intensive machines and services is driving the need for more power to keep the data centers of the world running. It has become crucial for large IT companies to monitor the energy efficiency of their data-center facilities and to take action on the optimization of these heavy electricity consumers. This paper proposes a Belief Rule-Based Expert System (BRBES)-based predictive model to predict the Power Usage Effectiveness (PUE) of a data center. The uniqueness of this model lies in the integration of a novel learning mechanism consisting of parameter and structure optimization using BRBES-based adaptive Differential Evolution (BRBaDE), significantly improving the accuracy of PUE prediction. The model has been evaluated using real-world data collected from a Facebook data center located in Luleå, Sweden. In addition, to prove the robustness of the predictive model, it has been compared with other machine learning techniques, such as an Artificial Neural Network (ANN) and an Adaptive Neuro Fuzzy Inference System (ANFIS), against which it showed better results. Further, due to its flexibility, the BRBES-based predictive model can capture the nonlinear dependencies of many variables of a data center, allowing PUE to be predicted with high accuracy. Consequently, it plays an important role in making data centers more energy-efficient.
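The target metric itself is straightforward; it is the prediction under changing workload and weather that is hard. PUE is defined as total facility energy over IT equipment energy (any consistent energy unit works, since it is a ratio):

```python
def pue(total_facility_energy, it_equipment_energy):
    """Power Usage Effectiveness: total facility energy divided by the
    energy delivered to IT equipment. The ideal value is 1.0; any
    excess is overhead (cooling, power distribution, lighting, etc.)."""
    return total_facility_energy / it_equipment_energy

# e.g. a month in which the facility drew 1,100 MWh against a 1,000 MWh IT load:
print(pue(1100.0, 1000.0))  # → 1.1
```

A predictive model like the BRBES described above estimates this ratio ahead of time from operational variables instead of computing it after the fact.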
In the rapidly developing field of Human-Robot Interaction (HRI), simulating human emotional states poses significant challenges due to the inherent complexity and unpredictability of human emotions. Addressing the limitations of artificial emotion simulation approaches such as fuzzy theory and memory mechanisms, we explore the genetic canvas that portrays emotions as an interplay of myriad complex expressions. By simulating emotional states using Genetic Hybridization Technology (GHT) in the Emotional State Transition (EST) model, this study investigates the role of genetics in artificial emotion simulation, outlines the creation of EST morphological genes, and validates their consistency. The results indicate that the Fréchet distance for EST curves ranges between 0.072 and 0.239, suggesting a high level of consistency between the experimentally generated EST curves and the newly generated EST morphological genes. This finding demonstrates the effectiveness of our proposed method and supports its future use in experimental design under various conditions. Additionally, we identified instances of gene mutations that occurred during the gene hybridization process, as highlighted in the results for EST curve (h). Despite this variation, the Fréchet distance remains within a reasonable range, further validating the reliability of our methodology. This study establishes a precedent for the methodology of emotional simulation, providing new research pathways for enriching HRI through substantive exploration of the relationship between artificial emotional intelligence (AEI) and GHT.
High-performance computing (HPC) in data centers increases energy use and operational costs. Therefore, it is necessary to manage resources efficiently for sustainability and a reduction in the carbon footprint. This research analyzes and optimizes ENEA HPC data centers, particularly the CRESCO6 cluster. The study starts by gathering and cleaning extensive datasets consisting of job schedules, environmental conditions, cooling systems, and sensor readings. Descriptive statistics accompanied by visualizations provide deep insight into the collated data. Inferential statistics are then used to investigate relationships between various operational variables. Finally, machine learning models predict the average hot-aisle temperature based on cooling parameters, which can be used to determine optimal cooling settings. Furthermore, idle periods for computing nodes are analyzed to estimate wasted energy, as well as to evaluate the effect that idle node shutdown would have on the thermal characteristics of the data center under consideration. The study closes with a discussion of how statistical and machine learning techniques can improve data center operations by focusing on the important variables that determine consumption patterns.
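The idle-energy estimate described can be sketched as a simple sum of idle hours times idle draw. The node counts, hours and the 250 W idle power below are hypothetical assumptions, not CRESCO6 figures:

```python
def idle_energy_waste_kwh(idle_hours_per_node, idle_power_w):
    """Energy wasted by idle compute nodes: per-node idle hours times
    an assumed idle power draw, summed over the cluster (Wh → kWh)."""
    return sum(hours * idle_power_w for hours in idle_hours_per_node) / 1000.0

# Four nodes idling 120, 80, 200 and 50 hours in a month at an assumed 250 W:
waste = idle_energy_waste_kwh([120, 80, 200, 50], idle_power_w=250)  # → 112.5 kWh
```

Such an estimate makes the idle-node-shutdown scenario quantifiable before weighing it against its thermal side effects.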
The deployment of autonomous vehicles has the potential to significantly lessen a variety of current harmful externalities (such as accidents, traffic congestion, security risks, and environmental degradation), making autonomous vehicles an emerging topic of research. In this paper, a literature review of autonomous vehicle development has been conducted, with the notable finding that autonomous vehicles will inevitably become an indispensable, greener solution in future. Subsequently, five different deep learning models, YOLOv5s, EfficientNet-B7, Xception, MobilenetV3, and InceptionV4, have been built and analyzed for 2-D object recognition in the navigation system. When tested on the BDD100K dataset, YOLOv5s and EfficientNet-B7 appear to be the two best models. Finally, this study proposes Hessian, Laplacian, and Hessian-based Ridge Detection filtering techniques to optimize the performance and sustainability of those two models. The results demonstrate that these filters can increase the mean average precision by up to 11.81%, reduce detection time by up to 43.98%, and significantly reduce energy consumption by up to 50.69% when applied to the YOLOv5s and EfficientNet-B7 models. Overall, the experimental results are promising and could be extended to other domains for semantic understanding of the environment, and the filtering algorithms for multiple object detection and classification could be applied to other areas. Recommendations and future work are clearly defined in this study.
Over the past few decades, the demand for Data Center (DC) services has significantly increased due to the world's growing need for internet access, social networking, and data storage. Data Centers are among the most energy-intensive businesses, so optimizing IT operations in DCs requires energy-efficient techniques. This paper presents AI-based modeling strategies for effective energy management, with a particular emphasis on the DC's two most energy-intensive systems (i.e., cooling and IT systems). This study addresses the issues of IT equipment performance degradation, inappropriate IT room thermal conditions, inefficient workload placement, and excessive energy waste. The research entails the application of machine learning for DC thermal classification, and the deployment of deep learning models to predict resource utilization and energy consumption in the DC. Furthermore, a comparative analysis is performed against existing relevant methods to demonstrate the effectiveness and accuracy of the proposed AI techniques. The findings of this study also provide evidence-based recommendations for efficient DC energy management.
In the classical Projection-based Model for cardinal directions [6], a two-dimensional Euclidean space relative to an arbitrary single-piece region, a, is partitioned into the following nine tiles: North-West, NW(a); North, N(a); North-East, NE(a); West, W(a); Neutral Zone, O(a); East, E(a); South-West, SW(a); South, S(a); and South-East, SE(a). In our Horizontal and Vertical Constraints Model [9], [10], these cardinal directions are decomposed into sets corresponding to horizontal and vertical constraints, and composition is computed for these sets instead of the typical individual cardinal directions. In this paper, we define several whole and part direction relations and then show how to compose such relations using a formula introduced in our previous paper [10]. In order to develop a more versatile reasoning system for direction relations, we integrate mereology, topology, and cardinal directions, and include their negations as well. © 2010 Springer-Verlag.
In the existing literature, Semantic Web portals (SWPs) are sometimes known as semantic portals or semantically enhanced portals. A SWP is a next-generation Web portal which publishes contents and information readable by both machines and humans. It has all the generic functionalities of a Web portal but is developed using Semantic Web technologies, giving it several enhanced capabilities such as semantics-based search, browsing, navigation, process automation, and the extraction and integration of information (Lausen, Stollberg, Hernandez, Ding, Han & Fensel, 2004; Perry & Stiles, 2004). To date, the only available resources on SWPs are isolated published Web resources and research or working papers. There is a need to pool these resources together in a coherent way so as to give readers a comprehensive idea of what SWPs are and how they can be built, supported by appropriate examples. Additionally, this article provides useful Web links for more extensive as well as intensive reading on the subject.
Syllogistics is a type of logical reasoning involving quantifiers such as All and Some. Here, we explore the use of syllogisms to formalize quantified direction relations by incorporating them into the classical Projection-based Model for cardinal directions [Frank, 1992], in which a two-dimensional Euclidean space relative to an arbitrary single-piece region, a, is partitioned into the following nine tiles: North-West, nw(a); North, n(a); North-East, ne(a); West, w(a); Neutral Zone, o(a); East, e(a); South-West, sw(a); South, s(a); and South-East, se(a). Typically, only these tiles are employed for reasoning about cardinal direction relations [Ligozat, 1988; Goyal and Egenhofer, 2000; Skiadopoulos and Koubarakis, 2001, 2004]. Chen et al. (2007) have integrated cardinal direction relations with RCC-5 and RCC-8 to represent directional and topological information; they investigated the mutual dependencies between them and discussed the composition of the hybrid relations. In this paper, however, we develop formalisms to facilitate an expressive reasoning mechanism for spatial databases. These formalisms are the outcome of integrating RCC-5 [Cohn et al., 1997] for topological relations, the Projection-based Model for cardinal direction relations, and syllogisms. In the latter part of the paper, we demonstrate how to compute the composition of such hybridized cardinal direction relations.
We have shown how the nine tiles in the projection-based model for cardinal directions can be partitioned into sets based on horizontal and vertical constraints (called Horizontal and Vertical Constraints Model) in our previous papers (Kor and Bennett, 2003 and 2010). In order to come up with an expressive hybrid model for direction relations between two-dimensional single-piece regions (without holes), we integrate the well-known RCC-8 model with the above-mentioned model. From this expressive hybrid model, we derive 8 basic binary relations and 13 feasible as well as jointly exhaustive relations for the x- and y-directions, respectively. Based on these basic binary relations, we derive two separate composition tables for both the expressive and weak direction relations. We introduce a formula that can be used for the computation of the composition of expressive and weak direction relations between “whole or part” regions. Lastly, we also show how the expressive hybrid model can be used to make several existential inferences that are not possible for existing models.
Use of Philosophy, Epistemology, and Ontology to Provide Deeper Understanding of Knowledge Management
Reasoning Mechanism for cardinal directions in Geographical Information Systems
This paper is a positioning paper which outlines a proposal for engaging in the evaluation of eGovernment systems. The primary purpose of our proposed research is to develop, apply, test, and disseminate an evaluation framework which can support continuous, adaptable, and reflective evaluation of eGovernment systems. The theoretical bases for the methodology will be Information Systems (IS); the Soft Systems Methodology, SSM (Checkland and Scholes, 1990), which provides the platform for the analysis of the ‘soft’ aspects (e.g. human, political, cultural and organisational factors); and the Hard Systems Methodology (HSM), which provides methods and tools for quantitative measures and analyses of the system. A further three interrelated bases are Reflective Practice, Organisational Learning (OL), and Information and Knowledge Management (IKM). Some of the key underlying principles of a successful evaluation framework are good data collection and analysis methods and an evaluative reflective practice approach which entails the complete process of identification and analysis of strengths and problems, followed by rigorous testing, implementation, and revision of solutions. Such a cycle encourages organisational learning and promotes continuous improvement to both the evaluation framework and the system. Additionally, it aims to cultivate an organisational culture that supports evaluation through reflection, continuous learning, and knowledge management which facilitates knowledge creation, capture, sharing, application and dissemination.
Reasoning about Cardinal Directions
In our previous paper (Kor and Bennett, 2003), we have shown how the nine tiles in the projection-based model for cardinal directions can be partitioned into sets based on horizontal and vertical constraints (called Horizontal and Vertical Constraints Model). In order to come up with an expressive hybrid model for direction relations between two-dimensional single-piece regions (without holes), we integrate the well-known RCC-8 model with the above-mentioned model. From this expressive hybrid model, we derive 8 atomic binary relations and 13 feasible as well as jointly exhaustive relations for the x and y directions respectively. Based on these atomic binary relations, we derive two separate 8x8 composition tables for both the expressive and weak direction relations. We introduce a formula that can be used for the computation of the composition of expressive and weak direction relations between ‘whole or part’ regions. Lastly, we also show how the expressive hybrid model can be used to make several existential inferences that are not possible for existing models.
In a language-deficient domain such as buoyancy, students generally find it difficult to explain phenomena that daily saturate their lives, such as sinking and floating. To address this problem, we propose a simple and object-related articulation and reflection tool which is embedded in the BSL System (B stands for Body, while S and L are for String and Liquid respectively). An analysis of the findings reveals that, generally, use of the tool decreased over time. Evidence also shows that content in the tool was either adapted or misused. Finally, evidence suggests positive changes in students' conceptual knowledge of B and S but not L.
In this paper, we demonstrate how to group the nine cardinal directions into sets and use them to compute a composition table. Firstly, we define each cardinal direction in terms of a certain set of constraints. This is followed by decomposing the cardinal directions into sets corresponding to the horizontal and vertical constraints. We apply two different techniques to compute the composition of these sets. The first technique is an algebraic computation while the second is the typical technique of reasoning with diagrams. The rationale of applying the latter is for confirmation purposes. The use of typical composition tables for existential inference is rarely demonstrated. Here, we shall demonstrate how to use the composition table to answer queries requiring the common forward reasoning as well as existential inference. Also, we combine mereological and cardinal direction relations to create a hybrid model which is more expressive.
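The horizontal/vertical decomposition can be illustrated for point-like objects, where each cardinal tile splits into an x-component and a y-component that compose independently. This is a toy sketch; the paper's composition tables for extended regions are considerably richer:

```python
def compose_axis(r1, r2):
    """Compose two 1-D order relations between point-like objects:
    '-' (smaller coordinate), '0' (equal), '+' (greater)."""
    if r1 == "0":
        return {r2}
    if r2 == "0" or r1 == r2:
        return {r1}
    return {"-", "0", "+"}  # opposite orders: the result is unconstrained

# Horizontal (X) and vertical (Y) components of the nine cardinal tiles.
X = {"NW": "-", "N": "0", "NE": "+", "W": "-", "O": "0", "E": "+",
     "SW": "-", "S": "0", "SE": "+"}
Y = {"NW": "+", "N": "+", "NE": "+", "W": "0", "O": "0", "E": "0",
     "SW": "-", "S": "-", "SE": "-"}
TILE = {(X[t], Y[t]): t for t in X}

def compose(d1, d2):
    """Compose two cardinal directions axis by axis, then recombine the
    horizontal and vertical result sets into a set of possible tiles."""
    return {TILE[(x, y)] for x in compose_axis(X[d1], X[d2])
                         for y in compose_axis(Y[d1], Y[d2])}

print(sorted(compose("N", "S")))  # → ['N', 'O', 'S']
```

Composing axis-wise keeps the table small: two 3x3 tables replace one 9x9 table, which is the economy the Horizontal and Vertical Constraints Model exploits.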
From A to <A>: Keywords of Markup
Sustainable development and green computing issues are increasingly important for computing professionals. Preparing the next generation of implementers and developers requires that Higher Education providers develop curricula to reflect this. There are a number of drivers behind this, encompassing industry and professional body demands, policy makers’ directives, and institutional commitments to sustainability. Other reasons include the effectiveness of this topic as a way to address particular issues in engagement and recruitment to computing courses. When considering “sustainability” (or “green-ness”) in teaching, we should consider the appropriateness and suitability of material, and target it at an appropriate level. There are also choices about how to present the material so as to match students’ motivation, which can reflect gender and other demographic issues. Institutions can adopt different approaches, offering the topic as a component of wider programmes or as specialised courses in its own right. Some approaches integrate the topic into undergraduate teaching, treating environmental impact as a design constraint within a solution. This paper reports on some of these variations and directs readers to an online resource to enable colleagues interested in this topic to share ideas and approaches. Whilst the focus is on computing, many of the issues are transferable to other STEM disciplines.
Education in Green ICT and Control of Smart Systems: A First-Hand Experience from the International PERCCOM Masters Programme
PERCCOM (PERvasive Computing and COMmunications in sustainable development) Masters is the first innovative international programme in Green ICT for educating and equipping new IT engineers with Green IT skills for sustainable digital applications design and implementation. After five years of running the PERCCOM programme, this paper provides an assessment of skills and employability in the context of Green jobs and skills. The paper ends with a list of recommendations for the development of environment related education curricula.
Sustainability of Machine Learning Models: An Energy Consumption Centric Evaluation
Machine Learning (ML) algorithms have become prevalent in today's digital world. However, training, testing and deployment of ML models consume a lot of energy, particularly when datasets are huge. Consequently, this has a direct and adverse impact on the environment through Scope 2 emissions. Thus, it is beneficial to explore the environmental impact of ICT usage within an organisation. Additionally, it is vital to adopt energy consumption as a metric for the evaluation of existing and future ML models. Our research goal is to evaluate the energy consumption of a set of widely used ML classifier algorithms: Logistic Regression, Gaussian Naive Bayes (GNB), Support Vector, K-Nearest Neighbours (KNN), Decision Tree (DT), Random Forest, Multi-Layer Perceptron, AdaBoost, Gradient Boosting, LightGBM and CatBoost classifiers. The findings provide evidence-based recommendations for sustainable and energy-efficient ML algorithms. The experimental findings show that the GNB classifier consumes only 63 J/s, the lowest among all models, whereas the widely used KNN and DT classifiers consume 3 to 10 times more than the rest.
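The evaluation idea above, treating energy as a first-class metric for ML models, can be sketched in a simplified form. The snippet below is a hypothetical illustration, not the paper's method: it approximates energy as an assumed average power draw multiplied by measured CPU time (E = P × t), whereas a real study would use hardware measurement (e.g. RAPL counters). The 65 W figure and the toy workload are assumptions for illustration.

```python
import time

def estimate_energy_joules(train_fn, avg_power_watts=65.0):
    """Rough energy estimate: E = P_avg * CPU time.
    avg_power_watts is an assumed CPU package power; real studies
    measure consumption with hardware counters instead."""
    start = time.process_time()
    train_fn()
    cpu_seconds = time.process_time() - start
    return avg_power_watts * cpu_seconds

# Stand-in 'training' workload, purely for illustration.
def toy_training():
    total = 0
    for i in range(200_000):
        total += i * i
    return total

joules = estimate_energy_joules(toy_training)
print(f"estimated energy: {joules:.2f} J")
```

Comparing classifiers under this scheme means passing each model's training routine as `train_fn` on the same dataset and ranking the resulting joule estimates.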
Emotion assessment is a challenging task in the human-computer interaction interface. Previous studies have examined the relationship between emotion and color, but they fail to accurately analyze emotional semantics due to the numerous elements in human-computer interaction interfaces. As a result, a combination model of a backpropagation neural network (BPNN) and an artificial bee colony algorithm (ABC) was presented in this paper to predict the emotion semantics of the human-computer interaction interface. The mechanism of generating the weights and thresholds for each layer of BPNN was converted to the search for an optimal honey source. Meanwhile, according to experiment results and evaluation of elements in human-computer interaction interfaces, this paper has assessed the relationships amongst the eight key elements (ratio of graphics to text, color difference, color distribution, color harmony, theme style, white space ratio, frame style, number of colors) and emotion word pairs (moderation-fancy, calm-pleasure, confusing-clear, cold-kind, coarse-elegant). Furthermore, an emotion application database was established to determine how the amalgamation of critical elements affects the users’ feelings about the human-computer interaction interface to help designers build a user-centric interface. Finally, the database can be applied to relieve mental health problems by meeting the psychological expectations of users as mental healthcare intervention during the COVID-19 pandemic. Also, it can help designers to design a pleasurable visual interaction interface for a particular element to convey health-related information and protective measures.
Analyzing and detecting human intentions and emotions are important means to improve the communication between users and machines in the areas of human-computer interaction (HCI) and human-robot interaction (HRI). Despite significant progress in utilizing state-of-the-art (SOTA) Transformer-based models, various obstacles persist in managing complicated input interdependencies and extracting intricate contextual semantics. Moreover, existing approaches lack practical applicability and struggle to accurately capture and effectively manage the inherent complexity and unpredictability of human emotions. In recognition of the identified research gaps, we introduce a robust and innovative fuzzy multi-modal Transformer (FMMT) model. Our novel fuzzy Transformer model uniquely heightens the comprehension of emotional contexts by concurrently analyzing audio, visual, and text data through three distinct branches. By incorporating fuzzy mathematical theory and introducing a unique temporal embedding technique to trace the evolution of emotional states, it effectively handles the inherent uncertainty in human emotions, thereby filling a significant void in emotional AI. Building upon the FMMT model, we further explored the emotion expression approach. Furthermore, a performance comparison with SOTA baseline methods and a detailed ablation study were performed. The results show that the proposed FMMT outperforms the baseline methods. Finally, we conducted detailed experimental verification and empirical analyses of the practicality of the designed method by verifying uncertain emotions and analyzing emotional state transitions combined with personalized factors. Overall, our research makes a significant contribution to emotion analysis through the implementation of a novel fuzzy Transformer model. This model enhances emotion perception and advances the methods for analyzing emotional expression, thus setting an edge over prior studies.
Exploring the Application of Slicing Technology to Determine Spatial Information to Assist and Facilitate Robotic Disassembly
As lowering manufacturing costs and improving resource sustainability become more critical, remanufacturing has attained significant attention in the manufacturing industry. Disassembly is the first and most time-consuming step in remanufacturing. There are three ways to disassemble a part: manual disassembly (MD), human–robot collaborative disassembly (HRCD), and robotic disassembly (RD). RD uses robots without any human intervention and is the focus of this research. To implement RD, spatial information is crucial. As end-of-life (EOL) products are likely to be discontinued, their digital models are unlikely to be available, and determining spatial information using vision systems may not be possible due to occlusion and inaccessibility in restrictive spatial environments such as under a car bonnet. However, a digital model of an EOL product (including its internal profiles) can be obtained using a 3D scan or computed tomography (CT) scan and subsequently sliced into layers to determine essential spatial information to facilitate RD. This research therefore explores the application of slicing technology to assist the RD of a car battery within a car engine compartment: a scanned digital model of the engine compartment (under the bonnet) is sliced into layers and exported as a text file known as geometry-code (G-code), which contains a large amount of geometrical information that can be used to determine the geometry and location of each component and the spatial availability of an EOL product. The results show that the x, y, z coordinates extracted from the G-code can be used to define spatial availability, identify barriers, and facilitate the creation of collision-free paths for RD.
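As a rough illustration of the kind of geometric information the abstract above describes, the sketch below pulls x, y, z coordinates out of G-code move lines. The sample G-code and the parsing rules (only G0/G1 lines, modal carry-forward of omitted axes) are assumptions for illustration, not taken from the paper.

```python
import re

# Illustrative G-code fragment, not from the paper's scanned model.
GCODE = """\
G1 Z0.2 F1200
G1 X10.0 Y5.0
G1 X12.5 Y5.0
G1 X12.5 Y8.0 Z0.4
"""

def extract_coordinates(gcode_text):
    """Return a list of (x, y, z) tuples for each move line, carrying
    the last-seen value forward for any axis a line omits (G-code
    parameters are modal)."""
    coords = []
    current = {"X": None, "Y": None, "Z": None}
    for line in gcode_text.splitlines():
        if not line.startswith(("G0", "G1")):   # keep only move commands
            continue
        for axis, value in re.findall(r"([XYZ])(-?\d+\.?\d*)", line):
            current[axis] = float(value)
        coords.append((current["X"], current["Y"], current["Z"]))
    return coords

# [(None, None, 0.2), (10.0, 5.0, 0.2), (12.5, 5.0, 0.2), (12.5, 8.0, 0.4)]
print(extract_coordinates(GCODE))
```

From such coordinate sequences one could derive per-layer occupancy and hence the free space available to a robot arm.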
Data Analytics: Factors of Traffic Accidents in the UK
The traffic and accident datasets for this research are sourced from Data.gov.uk. The data analytics in this paper comprises three levels, namely descriptive statistical analysis, inferential statistical analysis, and machine learning. The aim of the data analytics is to explore the factors that could have an impact on the number of accidents and their associated fatalities. Some of the factors investigated are: time of day, day of the week, month of the year, speed limits, etc. Machine learning approaches have also been employed to predict the types of accident severity.
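The descriptive-statistics level of such an analysis can be sketched as below. The records and the day-of-week factor are made-up illustrations; the real datasets come from Data.gov.uk and carry many more fields.

```python
from collections import Counter

# Hypothetical accident records, each tagged with candidate factors.
accidents = [
    {"day": "Fri", "severity": "slight"},
    {"day": "Fri", "severity": "serious"},
    {"day": "Sat", "severity": "slight"},
    {"day": "Mon", "severity": "fatal"},
    {"day": "Fri", "severity": "slight"},
]

# Descriptive statistics: accident counts grouped by day of week.
by_day = Counter(rec["day"] for rec in accidents)
print(by_day.most_common(1))   # [('Fri', 3)]
```

Inferential analysis and severity prediction would then test whether such observed differences across factors are statistically meaningful.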
Health inequality is a widely reported problem. There is an existing body of work that links health inequality with geographical location, meaning that one might be more disadvantaged health-wise if born in one region rather than another. Existing health inequality work in various developed and developing countries relies on population census or survey data, and drawing effective conclusions requires large-scale data with multiple parameters. There is a new phenomenon in countries such as the UK, where governments are opening up citizen-centric data for transparency purposes and to facilitate data-informed policy making. Many health organisations, including the NHS and sister organisations (e.g. HSCIC), participate in this drive to open up data, and these health-related datasets can be exploited for health inequality analytics. This work presents a novel approach to analysing health inequality in English regions solely based on open data. A methodological and systematic approach grounded in the CRISP-DM methodology is adhered to for the analyses of the datasets. The analysis utilises a well-cited work on health inequality in children and the corresponding parameters: Preterm birth, Low birth weight, Infant mortality, Excessive weight in children, Breastfeeding prevalence and Children in poverty. An authoritative source of health datasets, the Public Health Outcomes (PHO) Framework, is chosen because it contains data for these parameters. The analysis is carried out using various SAS data mining techniques, such as clustering and time series analysis. The results show the presence of health inequality in English regions, and the work clearly identifies the English regions on the right and wrong sides of the divide. Policy and future work recommendations based on these findings are articulated in this research.
The work presented in this paper is novel as it applies SAS-based BI techniques to analyse health inequality for children in the UK solely based on open data.
This study attempts to model smoking behavior in the United States using Current Population Survey data from 2010 and 2011. An array of demographic and socioeconomic variables is used in an effort to explain smoking behavior from roughly 139,000 individuals. Two regression techniques are employed to analyze the data. These methods found that individuals with children are more likely to smoke than individuals without children; females are less likely to smoke than males; Hispanics, blacks, and Asians are all less likely to smoke than whites; divorcees and widows are more likely to smoke than single individuals; married individuals are less likely to smoke than singles; retired individuals are less likely to smoke than working ones; unemployed individuals are more likely to smoke than working ones; and as education level increases after high school graduation, smoking rates decrease. Finally, it is recommended that encouraging American children to pursue higher education may be the most effective way to minimize cigarette smoking.
Design of Multi-protocol Fast Charging Charger for Mobile Smart Devices Based on Gallium Nitride
Fast chargers play an important role in supporting smart city development by enabling access to mobile applications, location-based services, and real-time data systems. As mobile internet technology advances, smartphones are becoming increasingly powerful, but their energy consumption continues to rise. Moreover, the lack of compatibility between fast-charging protocols from different manufacturers presents a challenge for universal charging solutions. To address this, a multi-protocol fast-charging charger has been designed to support a range of fast-charging technologies. The charger adopts a quasi-resonant (QR) flyback architecture with a GaN power stage and a dedicated controller. This design allows for higher efficiency and compact size, making it well suited to the demands of modern mobile devices within the broader context of smart city infrastructure. In this work, an experimental prototype was successfully built. The experimental results show that the charger achieves fast charging with a maximum output power of 30 W, provides both Type-C and Type-A interfaces for convenient charging via a variety of cables, and supports a wide 90–264 Vac input voltage range in line with mains supplies in various countries. The Type-C interface supports up to 20 V/1.5 A output, and in our measurements the charger meets the limits of U.S. DOE Level VI and EU Regulation 2019/1782. In addition, the prototype's fast-charging protocol compatibility was tested during charging and discharging: it can fast-charge a variety of mobile phones that support the relevant protocols. It has the advantages of high power density, high conversion efficiency and strong compatibility.
This paper presents a comprehensive theoretical and computational framework for sparse ergodic control of stochastic systems with control-dependent diffusion. Existing sparse control studies are predominantly focused on finite-horizon or deterministic formulations, which limits their ability to capture long-term statistical behavior under persistent stochastic disturbances. To address this limitation, we generalize the ergodic control paradigm by incorporating a smooth approximation of the discontinuous l_0 penalty, achieving both differentiability and accurate sparsity representation. Theoretically, we establish the existence and uniqueness of the ergodic pair within a viscosity-solution framework and derive explicit error bounds for the smooth approximation. We further reveal a generalized “bang–off–bang” structure of optimal feedback policies in non-affine systems with control-dependent noise. On the computational side, we propose a Physics-Informed Neural Network (PINN) solver that effectively mitigates the curse of dimensionality and a distributed monotone algorithm for large-scale multi-agent systems, ensuring stability and consensus under network constraints. Extensive experiments on multi-robot swarm navigation and smart grid generator scheduling demonstrate that the proposed methods achieve over 85% actuation sparsity and significant cost reduction compared with conventional baselines. Overall, this work establishes a unified framework that integrates ergodic control, sparsity theory, and machine learning, offering scalable solutions for efficient and reliable long-term autonomous control under uncertainty.
Special Issue "AI for Sustainability and Innovation"
“AI for Sustainability and Innovation” aims to support GESI’s Smarter2030 initiative and the UN’s Sustainable Development Goals in order to contribute to a sustainable future. This theme encompasses theoretical and applied research to address challenges relating to society and human needs (SDG2 Zero Hunger: autonomous machines, smart, and precision agriculture to increase productivity and reduce waste; SDG3 Good Health and Well Being: smart and personalized health; SDG4 Quality Education: smart and personalized educational technologies), sustainable amenities and utilities for the environment (SDG7 Affordable and Clean Energy: smart grid, smart microgrid, and smart renewable energy management system; SDG13 Climate Change via Low Carbon Growth: smart technologies to reduce energy, as well as resource consumption and waste emissions; SDG11 Sustainable Cities and Communities: smart sustainable cities and infrastructure), and sustainable industry (SDG9 Industry, Innovation, and Infrastructure: smart technologies to support Industry 4.0; SDG12 Responsible Consumption and Production: smart technologies for resource optimization, energy efficiency, and waste reduction). In summary, this Special Issue, titled “AI for Sustainability and Innovation”, calls for AI-enabled research (position, theoretical, or applied) that addresses relevant SDGs across the following sectors (listed in GESI’s Smarter 2020 initiative): business, power, transportation, manufacturing, services (education and health), agriculture, and buildings.
Modelling fear of crime area variation of people in England
This research exploits multilevel modelling, the availability of large datasets, Bayesian statistics, and machine learning to provide valuable insights into the fear of crime phenomenon. In short, ignoring levels in large hierarchical datasets such as the BCS through traditional statistical approaches can produce incorrect conclusions in which inferences about individual characteristics are drawn from aggregate data, a problem known as the ‘ecological fallacy’. At present, the quantitative perspective on fear of crime is constrained by methodological limitations such as measure validation challenges, triangulation with qualitative data, samples drawn from large geographical areas, and a lack of clarity surrounding the dimensions of fear of crime. This research employs quantitative hierarchical multilevel models based on the Markov Chain Monte Carlo (MCMC) approach. MCMC helps analyse the BCS within smaller geographical areas to establish a quantitative Bayesian framework, and examines demographic and residents’ area characteristics for the fear of crime analysis. All multilevel models avoid collapsing categories in the survey responses by using ordered categorical outcome variables. The findings reveal that residents in England seem to have two distinct dimensions of the fear of crime: an emotional dimension related to three specific worries about crime, and a cognitive dimension associated with the perceived probability of becoming a crime victim. The findings suggest that educated females are less worried about crime, although education makes no difference to females’ perceived likelihood of criminal victimisation. Another key finding is that fear of crime in a local authority district (LAD) strongly correlates with out-migration. This research enhances our understanding of the link between fear of crime and multiple deprivation in local areas.
A key finding from the machine learning models is that the model with all the area-level (smaller and larger areas) covariates has better classification accuracy than the model based on the demographic data alone. Like multilevel statistical models, machine learning models also confirm that multiple household victimisation experience is a crucial variable in the cognitive dimension of fear of crime, e.g., perceived likelihood of criminal victimisation. It contributes to a practical application of machine learning for modelling the fear of crime phenomenon.
This paper presents a mixed-resolution stereo video coding model for High Efficiency Video Coding (HEVC). The challenging aspects of mixed-resolution video coding are enabling the codec to encode frames with different resolutions and to use decoded pictures of different resolutions for referencing. These challenges are amplified in HEVC, since the incoming video frames are subdivided into coding tree units. The ingenuity of the proposed codec’s design is that the information in intermediate frames is down-sampled, yet the frames retain the original resolution. To enable random access to a full-resolution decoded frame in the decoded picture buffer as a reference frame, a down-sampled version of the decoded full-resolution frame is used. The test video sequences were coded using the proposed codec and standard MV-HEVC. Results show that the proposed codec gives significantly higher coding performance than the MV-HEVC codec.
Issues surrounding knowledge and information dissemination for domestic workers in Malaysia
Large numbers of Indonesian women seek employment as domestic workers in Malaysia in order to escape poverty and unemployment. In most cases the decision to work abroad is made without the workers being properly informed about what to expect. Most Indonesian migrant domestic workers do not know about the processes and procedures involved, or about their rights and the possibilities for seeking assistance when problems occur. The aim of the paper is to investigate the knowledge and information needs of Indonesian domestic workers in Malaysia and to suggest an information dissemination strategy appropriate for supporting these women. Major interest groups are identified, information needs analysed, challenges in accessing information critically evaluated, and, finally, appropriate strategies for disseminating information recommended.
This is a theoretical chapter which aims to integrate various epistemologies from the philosophical, knowledge management, cognitive science, and educational perspectives. From a survey of knowledge-related literature, this chapter collates diverse views of knowledge. This is followed by categorising as well as ascribing attributes (effability, codifiability, perceptual/conceptual, social/personal) to the different types of knowledge. The authors develop a novel Organisational Information and Knowledge Management Model which seeks to clarify the distinctions between information and knowledge by introducing novel information and knowledge conversions (information-nothing, information-information, information-knowledge, knowledge-information, knowledge-knowledge) and providing mechanisms for individual knowledge creation and information sharing (between individual-individual, individual-group, group-group) as well as Communities of Practice within an organisation. © 2011, IGI Global.
Purpose – The purpose of this paper is to propose a more realistic view of innovation in local government. A key element in this is the notion of innovation value based on people, processes and technology. Design/methodology/approach – The objectives are achieved by reviewing the background literature, a recent study of eGovernment achievement in the UK (VIEGO), and government assessments of innovation in both the EU and the UK. Some empirical evidence of the inherent complexity is also used. Findings – Extant models of innovation tend to focus on private sector values, and their transfer to the public sector is questionable. This, combined with a weak approach to evaluation, leaves local government vulnerable. Originality/value – The political rhetoric that accompanied the introduction of eGovernment expected it to produce innovation in the way government agencies conducted themselves. It is assumed that this innovation is both “good” and inevitable. This paper challenges these simplistic assumptions and proposes a more realistic view.
This book chapter is an extension of (Bazarhanova et al., Belief rule-based environmental responsibility assessment for small and medium-sized enterprises (note: this includes a comparison with fuzzy logic), Proceedings of 2016 IEEE Future Technologies Conference, 6–7 December, San Francisco, US, 2016; Bazarhanova et al., A Belief Rule-Based Environmental Responsibility Assessment System for Small and Medium-Sized Enterprises (note: without comparison with fuzzy logic), International SEEDS Conference, 14–15th September, 2016, Leeds. (Won Highly Commended Award for Green Infrastructure Category, 2016)) and adaptation from (Bazarhanova, A Belief Rule-Based Environmental Responsibility Assessment System for Small and Medium-Sized Enterprises, An unpublished Masters Degree Dissertation, Leeds Beckett University. URL: https://www.doria.fi/bitstream/handle/10024/124773/Thesis%20Bazarhanova.pdf?sequence=2, 2016). This chapter proposes the use of belief rule-based (BRB) inference engine for Environmental Responsibility assessment in small- and medium-sized enterprises. Such a context-adapted approach is believed to generate well-balanced, sensible, and Green ICT readiness-adapted results, to help enterprises focus on making improvement on more sustainable business operations.
Hardware diversity and modified NUREG/CR-7007 based assessment of NPP I&C safety
Diversity and subdiversity-oriented systems applied in safety-critical industrial systems are analyzed using the classification scheme described in NUREG/CR-7007. This classification is specified considering the diversity of hardware and FPGA designs. In particular, diversity of hard logic and soft processors, interfaces and buses, self-diagnostic means, etc. is described. The impact of hardware/FPGA diversity on the safety of instrumentation and control systems (I&Cs) is analyzed. A technique for selecting such diversity is presented.
A mobile ad hoc network (MANET) is a self-configuring wireless network in which each node could act as a router, as well as a data source or sink. Its application areas include battlefields and vehicular and disaster areas. Many techniques applied to infrastructure-based networks are less effective in MANETs, with routing being a particular challenge. This paper presents a rigorous study into simulation techniques for evaluating routing solutions for MANETs, with the aim of producing more realistic simulation models and thereby more accurate protocol evaluations. MANET simulations require models that reflect the world in which the MANET is to operate. Much of the published research uses movement models, such as the random waypoint (RWP) model, with arbitrary world sizes and node counts. This paper presents a technique for developing more realistic simulation models to test and evaluate MANET protocols. The technique is animation, which is applied to a realistic scenario to produce a model that accurately reflects the size and shape of the world, node count, movement patterns, and time period over which the MANET may operate. The animation technique has been used to develop a battlefield model based on established military tactics. Trace data has been used to build a model of maritime movements in the Irish Sea. Similar world models have been built using the random waypoint movement model for comparison. All models have been built using the ns-2 simulator. These models have been used to compare the performance of three routing protocols: dynamic source routing (DSR), destination-sequenced distance-vector routing (DSDV), and ad hoc on-demand distance vector routing (AODV). The findings reveal that protocol performance is dependent on the model used. In particular, it is shown that RWP models do not reflect the performance of these protocols under realistic circumstances, and protocol selection is subject to the scenario to which it is applied.
To conclude, it is possible to develop a range of techniques for modelling scenarios applicable to MANETs, and these simulation models could be utilised for the evaluation of routing protocols.
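The random waypoint (RWP) model whose realism the study above questions can be sketched as follows. This is a minimal illustrative implementation with assumed world size and speed range, not the ns-2 mobility model used in the research.

```python
import random

def random_waypoint(steps, world=(1000.0, 1000.0), speed=(1.0, 20.0), seed=42):
    """RWP sketch: a node repeatedly picks a uniformly random destination
    and speed, then moves toward it one step per time unit."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, world[0]), rng.uniform(0, world[1])
    path = [(x, y)]
    target = (rng.uniform(0, world[0]), rng.uniform(0, world[1]))
    v = rng.uniform(*speed)
    for _ in range(steps):
        dx, dy = target[0] - x, target[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= v:
            # Waypoint reached: jump to it and choose a new one.
            x, y = target
            target = (rng.uniform(0, world[0]), rng.uniform(0, world[1]))
            v = rng.uniform(*speed)
        else:
            # Advance v units along the straight line to the waypoint.
            x, y = x + v * dx / dist, y + v * dy / dist
        path.append((x, y))
    return path

path = random_waypoint(steps=100)
print(len(path), path[0])
```

The study's criticism is visible even in this sketch: every parameter (world size, speed, dwell behaviour) is arbitrary, whereas an animation-derived model fixes them from a real scenario.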
Adoption of Social Media as Communication Channels in Government Agencies
Social media has become an integral part of many people's lives around the world. The main use of this communication channel is to connect with social circles. It is also widely used for commercial and business purposes. Governments are also keen to use social media as an alternative to the traditional communication channels. Nonetheless, when the level of use of social media in the government is compared to other fields, a clear gap becomes apparent. This chapter investigates the adoption of social media as a communication channel between citizens, public agencies and government departments; and considers a wide range of factors that affect the issue from the perspective of public agencies. This chapter presents an extensive literature review and proposes a framework that organises the critical factors that affect public agencies' efforts while implementing social media. We also provide a list of hypotheses to validate and evaluate the significance of these factors.
Evaluation of ICT Environmental Impact for an SME
ICT has evolved tremendously in the last decade and has now turned its focus to sustainability. This research aims to evaluate the environmental impact of ICT for an SME. The organisation's existing ICT equipment is examined methodically using the ISO 14040 guidelines, the SusteIT ICT Carbon Footprint Tool, and the Environmental Paper Network paper calculator. Interviews have also been conducted to support the audit. To evaluate the environmental impacts of ICT, electricity consumption, carbon footprint and paper usage are examined.
Effective Green IT Strategy in a UK Higher Education Institute
© 2016 IEEE. The aim of this research is to investigate the implementation of a green IT strategy in a UK university. To achieve this aim, we investigate the effectiveness of the strategy through the benefits gained, such as saving money and energy and reducing carbon emissions through the use of environmentally friendly devices and reduced use of printers and paper, as well as through e-waste management; we also examine some of the difficulties encountered when applying the strategy. Data were collected through interviews, which suit this qualitative research because the findings need to be accurate and specific. The research provides recommendations for this and other universities to encourage delivery of the strategy in an environmentally effective and efficient manner.
An Evaluation of the Impact of Remote Collaboration Tools on Corporate Sustainability
This research aims to assess and analyse the sustainability impact of collaborative video-conferencing ICT capabilities within a commercial setting for a large global organisation. The objectives are: to audit the current operating model of the IT Architecture department from an ICT perspective; to investigate the extent and nature of use of video-conferencing ICT capabilities and to quantify the proportion of meetings undertaken using them; to provide a synopsis of current literature in the domain of greening by IT, specifically contributions relating to the use of video-conferencing ICT capabilities; and to critically evaluate the opportunities presented by video-conferencing ICT capabilities to contribute to more sustainable practices. Based on our findings, the effects of ICT equipment for video conferencing are as follows: (i) a positive primary enabling impact of 15,932.83 tCO2e through reduced travel, (ii) direct emissions of 26.47 tCO2e due to the use of the ICT equipment, and (iii) a calculated net enabling effect of a saving of 15,906.36 tCO2e.
Energy Efficiency of 4th Gen Intel® Core™ Processor vs 3rd Gen Intel® Core™ Processor
This paper is an extended version of [24]. This research compares the energy efficiency of two generations of Intel processors: the 4th Gen Intel® Core™ Processor and the 3rd Gen Intel® Core™ Processor. It also surveys the technologies that provide better energy performance in each of the processors. The methodology used for this research is a physical experiment conducted in an Intel production plant. The results obtained from the experiment show that the 4th Gen Intel® Core™ Processor is more energy efficient than the 3rd Gen Intel® Core™ Processor.
This paper is an extension of [1] and abstracted from [43]. Battery consumption is an important aspect of mobile application development and has to be considered by developers in their applications. This study presents an analysis of different relevant concepts and parameters that may have an impact on the energy consumption of Windows Phone applications. This operating system was chosen because research on it is limited, even though related studies exist for the Android and iOS operating systems; a further reason is the increasing number of Windows Phone users. The objective of this research is to categorise the energy consumption parameters (e.g. use of one thread or several threads for the same output). The result for each group of experiments is analysed and a rule derived; the set of derived rules will serve as a guide for developers who intend to develop energy-efficient Windows Phone applications. For each experiment, one application is created for each concept and the results are presented in two ways: a table and a chart. The table presents the duration of the experiment, the battery consumed, the expected battery lifetime and the energy consumption, while the charts display the energy distribution across the main threads: the UI thread, the application thread and the network thread.
Java has dominated the ICT market for almost thirty years, with applications in nearly every sector all over the world. One of Java's main drawbacks is its heavyweight core, the Java Virtual Machine (JVM), and several JVM distributions have been developed to address this issue. GraalVM is the most promising of the recent distributions, providing better performance, lower power consumption and a reduced carbon footprint. In this research, a comparative analysis based on performance and energy efficiency metrics was conducted to assess this JVM distribution against three other classic JVM distributions: Amazon Corretto, AdoptOpenJDK, and Zulu. Findings showed that, although there was no significant difference between the test candidates, GraalVM appeared to be the leading JVM distribution. It is recommended that programmers and technology businesses consider adopting GraalVM in their future Java applications because of its energy efficiency.
Automated expert systems provide decision support for a given subject domain by prompting users for answers. The answers can be ‘yes’, ‘no’, or a selected response from multiple-choice items. To date, there is no account of any expert system which allows a user to defer or revise his/her response to a particular question. Our Rules-Based Guidance (RBG) System provides such flexibility to the user; at the same time, its recommendation (or conclusion) is updated according to the revised or new user input. Legislation compliance is complex, and thus the rules represented in the knowledge base are complex Boolean expressions. Easily comprehensible and intuitive information visualisation is necessary to help a lay user understand the reasoning process and the derivation of the final recommendation, and also to revisit the state of his/her responses to a fixed set of questions. Thus, the RBG System provides functionality for transforming Boolean expressions (with user-associated inputs) into a Directed Acyclic Graph (DAG).
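The kind of DAG transformation described above can be sketched as follows. This is an illustrative Python sketch, not the RBG System's actual implementation; the three-valued evaluation (with `None` for a deferred answer) and all names are our own invention.

```python
# Illustrative sketch: a Boolean rule expression held as a DAG, so a shared
# sub-expression (e.g. one user-answered question) appears as a single node.
# Answers are True/False, or None if the user has deferred the question.

class Node:
    """A DAG node: either a user-input question or an AND/OR gate."""
    def __init__(self, op, children=(), name=None):
        self.op, self.children, self.name = op, tuple(children), name
        self.value = None  # None = unanswered; the user may revise it later

    def evaluate(self):
        if self.op == "input":
            return self.value
        vals = [c.evaluate() for c in self.children]
        if self.op == "and":
            if False in vals: return False          # short-circuit
            return None if None in vals else True   # undecided if input missing
        if self.op == "or":
            if True in vals: return True
            return None if None in vals else False

# Two sub-expressions share the question q1: because the DAG stores q1 once,
# revising the user's answer updates every conclusion that depends on it.
q1 = Node("input", name="Is personal data processed?")
q2 = Node("input", name="Is data transferred abroad?")
rule = Node("and", [q1, Node("or", [q2, q1])])

q1.value, q2.value = True, False
print(rule.evaluate())  # True
q1.value = False        # the user revises an earlier answer
print(rule.evaluate())  # False
```

Because shared questions are single nodes, the graph also records which answered, unanswered and revised inputs feed each conclusion, which is what a visualisation layer can walk.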
The article is an extension of an earlier paper. It describes methods for dealing with reliability and fault tolerance issues in cloud-based datacenters. These methods focus mainly on the elimination of any single point of failure within the components of the cloud infrastructure, the availability of the infrastructure, and the accessibility of cloud services. Methods for providing the availability of hardware, software and network components are also presented. The analysis of the actual accessibility of cloud services, and the mapping of a cloud-based datacenter infrastructure with different levels of reliability to the Tier Classification System, is described. Non-compliance of the actual accessibility with the High Availability level for cloud web services is revealed.
The main goal of this proposed project is to harness emerging IoT technology to empower the elderly population to self-manage their own health and to stay active, healthy and independent as long as possible within a smart and secure living environment. An integrated open-source IoT ecosystem will be developed. It will encompass the entire data lifecycle, which involves the following processes: data acquisition and data transportation; data integration, processing, manipulation and computation; visualisation; data intelligence and exploitation; data sharing; and data storage. This innovative cloud-based IoT ecosystem will provide a one-stop shop for integrated smart IoT-enabled services to support older people (65 years old or above) who live alone at home (or in care homes). Another innovation of this system is the design and implementation of an integrated IoT gateway for wellbeing wearables and home automation system sensors with varying communication protocols. The SMART-ITEM system and services will address the following: (i) smart health and care; (ii) smart quality of life; (iii) the SMART-ITEM social community. The development of the system will be based on the User Centred Design methodology so as to ensure active user engagement throughout the entire project lifecycle, and necessary standards and compliances will be adhered to (e.g. security, trust and privacy) in order to enhance user acceptance.
This paper is an extension of [17]. This research proposes the use of a Belief Rule-Based (BRB) approach to assess an enterprise's level of commitment to environmental issues. Participating companies complete a structured questionnaire, and an automated analysis of their responses determines their environmental responsibility level. This is followed by a recommendation on how to progress to the next level; the recommended best practices help promote understanding, increase awareness, and make the organisation greener. BRB expert systems consist of two parts, a knowledge base and an inference engine, which are used to derive valid conclusions from rules established by experts with domain-specific knowledge. The knowledge base in this research was constructed after an in-depth literature review and critical analyses of existing environmental performance assessment models, and was primarily guided by the EU Draft Background Report for the development of an EMAS Sectoral Reference Document on "Best Environmental Management Practice in the Telecommunications and ICT Services Sector". The reasoning algorithm of the selected Drools JBoss BRB inference engine is forward chaining. However, the forward chaining mechanism is not equipped with uncertainty handling; therefore, a decision was made to deploy evidential reasoning and forward chaining with a hybrid knowledge representation inference scheme to accommodate imprecision, ambiguity and fuzzy types of uncertainty. It is believed that such a system generates well-balanced, sensible results adapted to Green ICT readiness, helping enterprises focus on making their business operations more sustainable.
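As a rough illustration of forward chaining over rules that carry belief degrees (this is not the paper's Drools/evidential-reasoning implementation; the facts, rules and the simple min-scaling combination below are invented for illustration):

```python
# Toy forward chaining with belief degrees: a derived fact's belief is the
# rule's belief scaled by the weakest antecedent. All values are examples.
rules = [
    # (antecedents, consequent, rule belief)
    ({"recycles_ewaste", "uses_renewables"}, "green_operations", 0.9),
    ({"green_operations"}, "high_readiness", 0.8),
]

def forward_chain(facts):
    """facts: dict mapping fact -> belief in [0, 1]; returns extended dict."""
    facts = dict(facts)
    changed = True
    while changed:                       # fire rules until no new facts derive
        changed = False
        for antecedents, consequent, rule_belief in rules:
            if antecedents <= facts.keys() and consequent not in facts:
                belief = rule_belief * min(facts[a] for a in antecedents)
                facts[consequent] = belief
                changed = True
    return facts

result = forward_chain({"recycles_ewaste": 1.0, "uses_renewables": 0.7})
print(result["high_readiness"])  # approximately 0.504 (0.8 * 0.9 * 0.7)
```

A real BRB system combines rule activations with evidential reasoning rather than a bare minimum, but the chaining loop itself has this shape.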
A rapidly emerging trend in the IT landscape is the uptake of large-scale datacenters, moving storage and data processing to providers located far away from end-users or locally deployed servers. For these large-scale datacenters, power efficiency is a key concern, with PUE (Power Usage Effectiveness) and DCiE (Data Centre infrastructure Efficiency) being important metrics. This article proposes a belief rule-based expert system (BRBES) to predict datacenter PUE under uncertainty. The system has been evaluated using real-world data from a data center in the UK. The results can help in planning the construction of new datacenters and the redesign of existing ones, making them more power efficient and leading to a more sustainable computing environment. In addition, an optimal learning model for the BRBES is demonstrated and compared with an Artificial Neural Network and a Genetic Algorithm; the results are promising.
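The two metrics named above have standard definitions: PUE is total facility energy divided by IT equipment energy (ideal value 1.0), and DCiE is the reciprocal ratio expressed as a percentage. A minimal sketch, using illustrative figures rather than the paper's data:

```python
# PUE  = total facility energy / IT equipment energy   (lower is better, >= 1.0)
# DCiE = IT equipment energy / total facility energy, as a percentage

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

def dcie(total_facility_kwh, it_equipment_kwh):
    return 100.0 * it_equipment_kwh / total_facility_kwh

# Example: a facility drawing 1,800 kWh to deliver 1,000 kWh to IT equipment.
print(pue(1800.0, 1000.0))   # 1.8
print(dcie(1800.0, 1000.0))  # 55.55... (%)
```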
The Internet of Things (IoT) is an outcome of the emerging third wave of development of the Internet. Big data analytics in IoT provides valuable insights for smart and sustainable systems. Cloud data centers deliver on-demand computing resources for processing voluminous data, and the servers provisioned for this purpose consume enormous amounts of energy, contributing 2% of global carbon-dioxide (CO2) emissions. IoT energy concerns are being addressed by research in low-power sensors and improved machine-to-machine communications; however, cloud data centers still face an energy crisis. This work analyses the energy behaviour of compute hosts when applying Virtual Machine (VM) consolidation in a multi-node OpenStack cloud. Several works on VM consolidation have been evaluated on simulated workloads, but this work studies performance in a real cloud infrastructure with a big data workload. The preliminary results of this research are presented in this paper.
Green economics: A roadmap to sustainable ICT development
© 2018 IEEE. The paper discusses a systematic approach to sustainable development. It puts forward the idea of analysing the energy efficiency and sustainability of a particular product, service or even process over its whole life cycle. A minor carbon footprint or low energy consumption of a product during its operation or exploitation does not necessarily mean that the product's manufacturing, decommissioning and disposal are also sustainable. In this paper, we discuss a set of sustainability principles and propose a graphical notation describing the key factors of product/process sustainability. We also consider information and communication technologies (ICT) as essential tools of sustainable development in various application domains. On the other hand, ICT themselves should be considered as an object of energy efficiency improvement. The paper discusses ICT's impact on the environment and identifies the fundamental green ICT trade-off between dependability, performance and energy consumption. Finally, we consider the problems of, and propose approaches to, building green clouds and datacenters.
IoT-Enabled Smart Living
This book chapter is an extended version of (Kor et al., SMART-ITEM: IoT-Enabled Living. Proceedings of IEEE Future Technologies Conference 2016, San Francisco, 6–8 December, 2016). The main goal of this proposed project is to harness emerging IoT technology to empower the elderly population to self-manage their own health and stay active, healthy, and independent as long as possible within a smart and secure living environment. An integrated open-source IoT ecosystem will be developed. It will encompass the entire data life cycle, which involves the following processes: data acquisition and data transportation; data integration, processing, manipulation, and computation; visualization; data intelligence and exploitation; data sharing; and data storage. This innovative cloud-based IoT ecosystem will provide a one-stop shop for integrated smart IoT-enabled services to support older people (65 years old or above) who live alone at home (or in care homes). Another innovation of this system is the design and implementation of an integrated IoT gateway for well-being wearables and home automation system sensors with varying communication protocols. The SMART-ITEM system and services will address the following: (i) smart health and care, (ii) smart quality of life, and (iii) the SMART-ITEM social community. The development of the system will be based on the user-centered design methodology so as to ensure active user engagement throughout the entire project life cycle, and necessary standards as well as compliances will be adhered to (e.g., security, trust, and privacy) in order to enhance user acceptance.
Evidence suggests that physical activity brings substantial health benefits while its absence causes several health issues. As people become more aware of the negative health outcomes associated with physical inactivity, a shift from sedentary lifestyles to healthier ones occurs, and physical activity tracking apps may help in this regard. While mobile applications for tracking physical activity are abundant, most of them fail to deliver evidence-based recommendations. This is a major drawback, especially when these apps are designed to guide users towards healthy lifestyles. This paper presents a prototype application that provides evidence-based recommendations about how much physical activity adults should do to stay healthy, according to the user's current activity level. A new visualisation approach which uses animal representations for activity levels is also introduced to enhance user experience, increase motivation and create a good base for further integration of gamification principles. Early testing showed that users found the prototype very useful and expressed great interest in the animal representations.
It is now widely accepted that human behaviour accounts for a large portion of total global emissions, and thus influences climate change to a large extent (IPCC, 2014). Changing human behaviour when it comes to mode of transportation is one component which could make a difference in the long term. In order to achieve behavioural change, we investigate the use of a persuasive multiplayer game. Transportation mode recognition is used within the game to provide bonuses and penalties to users based on their daily choices regarding transportation. Preliminary results from testers of the game indicate that using games may be successful in causing positive change in user behaviour.
Cloud computing is emerging as a methodology for delivering more energy efficient computing provision. The potential advantages are well-known, and are primarily based on the opportunities to achieve economies of scale through resource sharing: in particular, by concentrating data storage and processing within data centers, where energy efficiency and measurement are well established activities. However, this addresses only a part of the overall energy cost of the totality of the cloud: energy is also required to power the networking connections and the end user systems through which access to the data center is provided, and researchers are beginning to recognize this. One further aspect of cloud provision is less well understood: the impact of application software behavior on the overall system's energy use. This is of particular concern when one considers the current trend towards “off the shelf” applications accessed from application stores. This mass market for complete applications, or for code segments included within other applications, creates a very real need for that code to be as efficient as possible, since even small inefficiencies, when massively duplicated, will result in significant energy loss. This position paper identifies this problem in detail, and proposes a support tool which will indicate to software developers the energy efficiency of their software as it is developed. Fundamental to the delivery of any workable solution is the measurement and selection of suitable metrics; we propose appropriate metrics and indicate how they may be derived and applied within our proposed system. Addressing the potential cost of application development is fundamental to achieving energy saving within the cloud, particularly as the application store model gains acceptance.
This paper discusses the challenge of modeling in-flight startle causality as a precursor to enabling the development of suitable mitigating flight training paradigms. The article presents an overview of aviation human factors and their depiction in fuzzy cognitive maps (FCMs), based on the Human Factors Analysis and Classification System (HFACS) framework. The approach exemplifies system modeling with agents (causal factors), which showcase the problem space's characteristics as fuzzy cognitive map elements (concepts). The FCM prototype enables four essential functions: explanatory, predictive, reflective, and strategic. This utility of fuzzy cognitive maps is due to their flexibility, objective representation, and effectiveness at capturing a broad understanding of a highly dynamic construct. Such dynamism is true of in-flight startle causality. On the other hand, FCMs can help to highlight potential distortions and limitations of use case representation to enhance future flight training paradigms.
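The FCM dynamics described above can be illustrated with a minimal sketch. The concepts and weights below are invented for illustration and are not taken from the paper's HFACS-based model; the update rule is the common Kosko-style iteration A_j(t+1) = sigmoid(sum_i W[i][j] * A_i(t)).

```python
import math

# Minimal fuzzy cognitive map (FCM) iteration. Concepts (hypothetical):
#   0 = fatigue, 1 = workload, 2 = startle severity.
# W[i][j] is the causal influence of concept i on concept j, in [-1, 1].

def sigmoid(x, lam=2.0):
    return 1.0 / (1.0 + math.exp(-lam * x))

W = [
    [0.0, 0.3, 0.6],   # fatigue raises workload and startle severity
    [0.4, 0.0, 0.5],   # workload raises fatigue and startle severity
    [0.0, 0.0, 0.0],   # startle severity is an outcome concept
]

def step(A):
    n = len(A)
    return [sigmoid(sum(W[i][j] * A[i] for i in range(n))) for j in range(n)]

A = [0.8, 0.6, 0.1]          # initial concept activations
for _ in range(30):          # iterate toward a fixed point
    A = step(A)
print([round(a, 3) for a in A])
```

Running the map to a fixed point is what supports the explanatory and predictive functions mentioned above: the converged activation of the outcome concept summarises the combined causal pressure of the input concepts.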
Development of a Simulation Experiment to Investigate In-Flight Startle using Fuzzy Cognitive Maps and Pupillometry
© 2019 IEEE. Loss of control in-flight (LOC-I), following loss of situational awareness and startle, has been identified as a leading cause of aviation fatalities in recent decades. This has led to significant effort toward improving safety records, particularly in the fields of flight crew training and in-flight support technologies that aid better decision making and the management of reactions to a startling occurrence. One way to achieve quality decision making in the cockpit is by providing adequate cueing and response-activating mechanisms carefully designed to aid human information processing. These response performances, especially in the context of reactionary management of startle in flight, can be honed through simulator-based training. This paper discusses the background and development of a model of startle causality dynamics using fuzzy cognitive mapping. This mapping provides an objective, strategic framework for determining the training required to manage the startle reflex in extenuating circumstances made significantly worse by human factors, such as erroneous knee-jerk reactions.
JANET UK Wireless LAN Project, Technical Report. Disseminated to JANET UK as a User Manual (for managers and engineers)
Background: In 2008/09, Professor Pattinson was the supervisor for a third-year project named ‘Up Stream’ (led by Richard Braddock). The main aims of the project were to develop a framework for delivering an e-learning environment in Africa (particularly Uganda and Malawi) using Free and Open Source Software (FOSS) and to create a system that is transportable. This is made possible with the use of a vehicle kitted out with very low power (and thus ‘green’) IT equipment whilst maintaining an acceptable level of functionality. The server machine that is employed contains the core operating system of all its connected machines; thus, the user machines do not need traditional hard drives, which often fail in a mobile environment. This is considered a viable green solution because of its low power consumption. One of the challenges faced was how to use a single machine to serve various roles. This was made possible by virtualisation, using VMware ESXi to deliver the core functionality of a server within a modular, appliance-led software stack. ESXi was chosen due to its 32 MB disk footprint. Additionally, a ‘Mobile ISP’ platform provides wire-free and gateway connectivity. The work carried out in this student project was the basis for our proposed contribution to the JANET Wireless LAN project, exploring the use of this technology to support students on “field-trip” type activities.
The political rhetoric that accompanied the introduction of eGovernment expected it to produce innovation in the way government agencies conducted themselves with citizens and businesses alike. It was assumed that innovation was both "good" and inevitable. This paper challenges these assumptions and presents a more realistic model of how innovation might occur in UK local government. The model is supported by anecdotal evidence, the literature and a recent study of eGovernment achievement in the UK (VIEGO). A key element in the model is the notion of innovation value.
This is an eGISE network paper. It is motivated by a concern to develop a better approach to learning from the experience of an eGovernment project and applying that knowledge in future projects. The proposed project is based on previous work in the construction industry that developed COLA, a Cross Organisational Learning Approach. Developing a similar strategy for Knowledge Management is likely to be effective because the 'silo' culture of local government organisations has parallels with the segmented organisational structures within the construction industry.
This is a theoretical paper which aims to integrate various epistemologies from the philosophical, knowledge management, cognitive science and educational perspectives. From a survey of knowledge-related literature, we have collated diverse views of knowledge. This is followed by categorising, as well as ascribing attributes to, the different types of knowledge. We have developed a novel Organisational Information and Knowledge Management Model which seeks to clarify the distinctions between information and knowledge by introducing novel information and knowledge conversions, followed by mechanisms for individual knowledge creation and information sharing within an organisation.
The aim of this paper is to present the findings of PhD research (Heinzl, 2007) conducted on the Universities of Applied Sciences in Austria. The research establishes an idiosyncrasy model for Universities of Applied Sciences in Austria showing the effects of their idiosyncrasies on the ability to successfully conduct technology transfer. The research is centred on qualitative methods, as the major emphasis is placed on theory building. The study pursues a stepwise approach to the establishment of the idiosyncrasy model. In the first step, an initial technology transfer model and a list of idiosyncrasies are established based on a synthesis of findings from secondary research. In the second step, these findings are enhanced by means of empirical research, including problem-centred expert interviews, a focus group and participant observation. In the third step, the idiosyncrasies are matched with the factors conducive to technology transfer; focused interviews were conducted for this purpose. The findings show that the idiosyncrasies of Universities of Applied Sciences have remarkable effects on their technology transfer abilities. This paper presents four of the models that emerge from the PhD research: the Generic Technology Transfer Model (Section 5.1); the Idiosyncrasies Model for the Austrian Universities of Applied Sciences (Section 5.2); the Idiosyncrasies-Technology Transfer Effects Model (Section 5.3); and the Idiosyncrasies-Technology Transfer Cumulated Effects Model (Section 5.3). The primary and secondary research methods employed for this study are: literature survey, focus groups, participant observation and interviews. The findings of the research contribute to the conceptual design of a technology transfer system which aims to enhance higher education institutions' technology transfer performance.
The aim of this paper is to present the findings of a PhD research (Heinzl 2007, Unpublished PhD Thesis) conducted on the Universities of Applied Sciences in Austria. Four of the models that emerge from this research are: Generic Technology Transfer Model (Sect. 5.1); Idiosyncrasies Model for the Austrian Universities of Applied Sciences (Sect. 5.2); Idiosyncrasies-Technology Transfer Effects Model (Sect. 5.3); Idiosyncrasies-Technology Transfer Cumulated Effects Model (Sect. 5.3). The primary and secondary research methods employed for this study are: literature survey, focus groups, participant observation, and interviews. The findings of the research contribute to a conceptual design of a technology transfer system which aims to enhance the higher education institutions' technology transfer performance. © 2012 Springer Science+Business Media, LLC.
Community networks intrinsically rely on being able to deploy large-scale projects with an explicit focus on cost effectiveness. As such, they often leverage not only open-source software but also some proprietary solutions which, although closed source, may not command a licence fee. This chapter briefly discusses an undergraduate project addressing a hardware solution that integrates several open-source software projects into a cohesive structure. The platform, dubbed a “Mobile ISP” or mISP, is a natural extension of the established Wireless ISP concept with a practical bent towards wire-free deployment and gateway connectivity. In addition, it justifies a split microarchitecture approach and depicts further usage schemas for the device, afforded by virtue of the extensibility it offers.
This book chapter is based on a project entitled ‘Does “thin client” mean “energy efficient”?’(Pattinson and Cross, 2011) funded under the JISC Greening ICT initiative. This project was devised to conduct actual measurements in use in a typical university environment. We identified a test area which was a mixed administrative and academic office location that supported a range of users, and we made a direct replacement of the current thick client systems with thin client equivalents; in addition, we exchanged a number of PCs operating in thin and thick client mode with devices specifically branded as “low power” PCs and measured their power requirements in both thin and thick modes. We measured the energy consumption at each desktop for the duration of our experiments, and also measured the energy draw of the server designated to supporting the thin client setup, giving us the opportunity to determine the power per user of each technology. Our results showed a significant difference in power use between the various candidate technologies, and that a configuration of low power PC in thick client mode returned the lowest power use during our study. We were also aware of other factors surrounding a change such as this: we have addressed the technical issues of implementation and management, and the non-technical or human factors of acceptance and use: all are reported within this document. Finally, our project was necessarily limited to a set of experiments carried out in a particular situation, therefore we use estimation methods to draw wider conclusions and make general observations which should allow others to select appropriate thick or thin client solutions in their situation.
This book chapter is based on a project entitled ‘Measuring Data Centre Efficiency’ (Pattinson and Cross, 2013) funded under the JISC Greening ICT initiative, call for projects 14/10 of October 2010. More specifically, it resides under initiative I of that call, “Rapid Innovation in ICT”. The metering methodology employed for this project fostered metering at varying granularity (e.g. measurement of energy use for individual devices/locations of the data centre; measurement of energy use for aggregated devices/locations). The project provided real experimental data relating to the accurate measurement of energy consumption in different parts of the data centre. Power Usage Effectiveness (PUE) was calculated based on the collated data and used to provide trends and a comparative analysis of the data centre's efficiency.
Substantial numbers of Indonesian women are seeking employment as domestic workers in Malaysia in order to escape poverty and unemployment and to be able to support their families back home. Most Indonesian domestic workers in Malaysia face unpleasant working conditions with long working hours and no freedom to move or communicate; some find themselves in a situation of abuse. In many cases, the decision to work abroad is made without being properly informed about what to expect. Furthermore, most of the Indonesian migrant domestic workers do not know about the processes and procedures involved and are not aware of their rights and the possibilities of seeking assistance when problems occur. In order to empower the target group, relevant information needs to be disseminated. Current strategies do not seem to achieve the desired effect. Many of the affected women come from remote areas, are poor and have a low level of education; therefore, their skills to make use of written or even digital information are limited. Appropriate strategies are suggested to utilise traditional and commonly used information dissemination channels such as cultural performances, group discussions and radio. Educational measures should be combined with aspects of local entertainment culture in order to attract attention and to provoke identification with the issues discussed. Further research is necessary to actually develop an appropriate information dissemination strategy with regard to the target group and to evaluate its benefits by conducting pilot projects.
"Sustainability" (or "green-ness") has a significant role to play in the teaching of information systems in higher education. There is considerable possible variation in the breadth and depth of content, which raises the question of appropriateness, the suitability of material and targeting at an appropriate level. There is also considerable potential for variation of emphasis in presentation according to students' motivations. Institutions have adopted different approaches, some utilising specialised courses as components of wider programmes, or in their own right. Others have integrated the topic into undergraduate teaching, perhaps as a component of systems analysis and design courses, treating environmental impact as a design constraint within a solution. Computer science programmes may use their computer architecture-themed modules to introduce the relationship between hardware design and energy use, or the ethics and professionalism strand may be developed through consideration of electronic waste or the legal issues around the need for compliance with legislation. This paper reports our study of these variations, and introduces some of the teaching materials we have developed as part of a recent HEA ICS project, along with an introduction to a community site to enable colleagues interested in this topic to share ideas and resources.
Sustainable and Green Information Systems: Preparing the next generation of practitioners.
Societies across the globe are recognising the potential troubles that may arise from environmental change potentially brought about by, or at least accelerated through, the actions and impact of human activity. This raises challenges for the computing industry, which is a large and growing contributor to the environmental impact of societies. The general green issues are realised as green computing within the context of IT. Whilst environmental concerns dominate the media, they are now recognised as part of a complex set of parameters within the wider agenda of considering the long term viability and development of societies. This wider structure is framed through the concept of sustainable development – recognising issues that are entwined with environmental concerns, typically financial, scientific and governance issues. Information systems - with its encompassing remit of IT hardware, software, data, processes, procedures and in particular people - provide an ideal framework through which to assess, plan and implement green strategies. These are relevant to businesses, as well as to government agencies. Such systems are widely expected to play a major role in the enhancement of sustainability across organisations: otherwise referred to as “greening by IT”, or “Green IT 2.0”, in which IT will support the activities and decisions required to operate in a more sustainable manner. The complexities of these issues –with the balance between evidence for change, costs and practicalities, and longer term developments – mean that there is no simple right answer. Moreover, these issues will continue to grow in importance and impact. Current IS systems are limited in their provision for such issues, and training related to this is still in the early stages of development. 
Building an awareness and understanding of these issues in the next generation of IS practitioners, so that they are able to produce systems that take account of sustainable development in its myriad forms, is therefore essential. In this paper we review some of the approaches to, and issues in, developing sustainable development within IS education, and identify some of the developments required to ensure that IS fulfils its potential to enable societies to deal with the complex and challenging changes that may follow. We address the question of breadth versus depth of coverage, and provide examples of work in progress in delivering these requirements.
Sustainable development and green computing issues are increasingly important for computing professionals. Preparing the next generation of implementers and developers requires that Higher Education providers develop curricula to reflect this. There are a number of drivers behind this – encompassing industry and professional body demands, policy makers’ directives, as well as institutional commitments to sustainability. Other reasons include the effectiveness of this topic as a way to address particular issues in engagement and recruitment to computing courses. When considering “sustainability” (or “green-ness”) in teaching, we should consider the appropriateness and suitability of material, and target it at an appropriate level. There are also choices about how to present the material so as to match students’ motivation, which can reflect gender and other demographic issues. Institutions can adopt different approaches, such as specialised courses as components of wider programmes, or as specialised courses in their own right. Some approaches integrate the topic into undergraduate teaching, treating environmental impact as a design constraint within a solution. This paper reports on some of these variations and directs readers to an online resource that enables colleagues interested in this topic to share ideas and approaches. Whilst the focus is on computing, many of the issues are transferable to other STEM disciplines.
Millions of people use cloud computing for business, education and socialisation. Examples of cloud applications include Google Drive for storage and Facebook for social networking. Cloud users rely on the cloud computing infrastructure in the belief that these services are easy and safe to use; however, there are security and performance issues to be addressed. This paper discusses how cloud users and cloud providers address performance and security issues. In this research, we have used business process modelling and simulation to explore the performance characteristics and security concerns in the service development life cycle. The results show that Business Process Model and Notation (BPMN) simulation is effective for studying the cloud security process in detail before actual implementation: a process with a total simulated duration of 51 days, 9 hours and 40 minutes was replayed, and its results displayed, in only 7 seconds.
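The gap between simulated duration and wall-clock time arises because a process simulator advances a virtual clock rather than waiting in real time. A minimal sketch of this idea (the step names and durations below are illustrative, not taken from the study):

```python
def simulate(steps):
    """Advance a virtual clock through a sequence of process steps.

    `steps` is a list of (duration_seconds, name) pairs for a hypothetical
    service-development process. The loop runs in microseconds of real time,
    however long the simulated process takes.
    Returns the total simulated duration and the timestamped event trace.
    """
    clock = 0
    trace = []
    for duration, name in steps:
        clock += duration                 # virtual time, not real waiting
        trace.append((clock, name))
    return clock, trace

DAY = 24 * 3600
# Illustrative steps summing to 51 days, 9 hours and 40 minutes
steps = [(3 * DAY, "requirements review"),
         (14 * DAY, "security assessment"),
         (34 * DAY, "service implementation"),
         (9 * 3600 + 40 * 60, "deployment checks")]
total, trace = simulate(steps)
```

The same principle underlies BPMN simulation engines: task durations are modelled, not lived through.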
Reducing the energy footprint of IT devices and software has been a challenge for Green IT research. Monitoring approaches have primarily focused on measuring the energy consumption of the hardware components of computing devices. However, the applications or software running on our computer systems also consume energy, and affect how various hardware components and system resources consume it. Consequently, running web browser applications incurs considerable energy and battery consumption. In this research, we have run different types of experiments involving several measuring tools. Firstly, a Joulemeter is used to monitor and measure the power consumed by the hardware and software while running web-based and stand-alone applications on several devices. Additionally, a tablet's built-in battery status checker is used to measure the battery consumption when web-based applications are run on the device.
Introduction to Green IT Chapter
This research compares the energy efficiency of two generations of Intel processors: the 4th Gen Intel® Core™ processor and the 3rd Gen Intel® Core™ processor. It also surveys the technologies that provide better energy performance in each processor. The methodology used for this research is a physical experiment conducted in an Intel production plant. The results obtained from the experiment show that the 4th Gen Intel® Core™ processor is more energy efficient than the 3rd Gen Intel® Core™ processor.
The main goal of this proposed research is to radically transform health services through incrementally evolving organisational change. This is made possible by exploiting advanced cloud-based ICT systems to empower patients to self-manage their own health and wellbeing. The project's innovation lies not in the base technology (which is highly innovative in itself) but in the rapid continuous improvement of technology-enabled service delivery (i.e. following PDSA iterative cycles) based on continuous user and technical requirements analyses. A range of ICT-enabled services is provided to all citizens with health records (including long-term and short-term patients) at greatly reduced cost. They are: a customisable e-health passport which contains all essential health and wellbeing information about the patient; an intelligent data analytics and decision support system; a clinical team view (with health intelligence facility); a hybrid diagnostic expert system for teleconsultation; and an online Community of Support.
Legal and Regulatory Framework for Green IT
This paper is an enhanced version of the paper presented at the SEEDS Conference (Olaoluwa et al., 2015). The increasing rate of carbon dioxide and other greenhouse gas emissions into the atmosphere, resulting from the use of IT and other human activities, has become a major source of concern. It is imperative for the IT sector to ensure that its products are effective and energy efficient, with a mitigated negative impact on the environment. Reducing the energy consumption of IT products is key to contributing towards a greener environment. Another alternative is to produce energy-efficient code for software applications. In programming or scripting languages, an end result can be achieved in more than one way. For example, in PHP, a print command can be executed using a single quote and can also be achieved using a double quote; the two have similar functions and a similar quality of intended outcome. The aim of this research is to conduct an investigation into the energy consumption of selected PHP scripts that perform similar functions: print with single and double quotes, echo with single and double quotes, etc. The Joulemeter energy measuring tool is used to measure the amount of energy consumed when the various PHP scripts are run.
Battery consumption is a very important aspect of mobile application development and has to be considered by all developers. This study presents an analysis of different relevant concepts and parameters that may have an impact on the energy consumption of Windows Phone applications. This operating system was chosen because limited research related to it has been conducted, even though there are related studies for the Android and iOS operating systems; a further reason is the increasing number of Windows Phone users. The objective of this research is to categorise the energy consumption parameters (e.g. use of one thread or several threads for the same output). The result for each group of experiments is analysed and a rule derived; the set of derived rules will serve as a guide for developers who intend to develop energy-efficient Windows Phone applications. For each experiment, one application is created for each concept and the results are presented in two ways: a table and a chart. The table presents the duration of the experiment, the battery consumed during the experiment, the expected battery lifetime, and the energy consumption, while the charts display the energy distribution across the main threads: the UI thread, the application thread, and the network thread.
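The table columns described above can be derived from a single experiment run; a small sketch of the arithmetic, assuming the device's nominal battery capacity is known (the function name and figures are illustrative, not from the study):

```python
def battery_metrics(duration_h, battery_drop_pct, capacity_wh):
    """Derive energy consumed and expected battery lifetime from one run.

    duration_h        -- how long the experiment ran, in hours
    battery_drop_pct  -- percentage of battery consumed during the run
    capacity_wh       -- nominal battery capacity in watt-hours (assumed known)

    Returns (energy_wh, expected_lifetime_h), where the expected lifetime is
    the runtime the application would allow if the drain rate stayed constant.
    """
    fraction = battery_drop_pct / 100.0
    energy_wh = capacity_wh * fraction
    expected_lifetime_h = duration_h / fraction if fraction else float("inf")
    return energy_wh, expected_lifetime_h

# e.g. a 0.5 h run that drains 4% of a hypothetical 7 Wh phone battery
energy, lifetime = battery_metrics(0.5, 4, 7)
```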
With small and medium-sized enterprises (SMEs) making up the majority of businesses globally, it is important that they act in an environmentally responsible manner. Environmental management systems (EMS) help companies evaluate and improve their environmental impact, but they often require human, financial, and time resources that not all SMEs can afford. This research encompasses interviews with representatives of two small enterprises in Germany to provide insights into their understanding and knowledge of an EMS and how they perceive their responsibility towards the environment. Furthermore, it presents a toolkit created especially for small and medium-sized enterprises. It serves as a simplified version of an EMS based on the ISO 14001 standard and is evaluated by target users and appropriate representatives. Among the findings: while open to the idea of improving their environmental impact, SMEs do not always feel it is their responsibility to do so, and they seem to lack the means to fully implement an EMS. The developed toolkit is considered useful and usable, and recommendations are drawn for its future enhancement.
The increasing rate of carbon and other greenhouse gas emissions into the atmosphere, resulting from the use of IT and other human activities, has become a major source of concern. It has become a matter of great importance for the IT sector to put its house in order by ensuring that its products are effective and efficient, with little or no negative impact on the environment. Effective and efficient products perform all their intended purposes with reduced consumption of energy resources, and thereby have a reduced impact on the environment. Reducing the energy consumption of IT products is key to contributing towards a greener environment. In programming or scripting languages, an end result can be achieved in more than one way. For example, in PHP, a print command can be executed using a single quote and can also be achieved using a double quote, with both achieving the same end results without affecting the quality of the intended outcomes. This has led to research on the energy consumption of selected PHP scripts that perform similar functions: print with single and double quotes, echo with single and double quotes, etc. The Joulemeter energy measuring tool is used to measure the amount of energy consumed when the various PHP scripts are run.
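Tools such as Joulemeter report a power trace over time; the energy attributed to a script run is then the integral of power over the run's duration. A minimal sketch using the trapezoidal rule (the two sample traces are hypothetical values, not measurements from the study):

```python
def energy_joules(samples):
    """Integrate a list of (time_s, power_w) samples into energy in joules.

    Uses the trapezoidal rule over consecutive samples, approximating the
    area under the power curve for the duration of a script run.
    """
    total = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        total += (p0 + p1) / 2.0 * (t1 - t0)
    return total

# Hypothetical power traces for two script variants doing the same work
single_quote = [(0, 10.0), (1, 10.2), (2, 10.1)]
double_quote = [(0, 10.0), (1, 10.6), (2, 10.4)]
e_single = energy_joules(single_quote)
e_double = energy_joules(double_quote)
```

Comparing the integrated energy of variants run under identical conditions is what lets per-construct rules (single vs double quotes, print vs echo) be derived.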
A Simulation Environment for investigating In-Flight Startle in General Aviation
Loss of control in-flight (LOC-I), precipitated by a loss of situational awareness and the presence of startled responses, has been identified as a leading cause of aviation fatalities in recent decades. This has led to significant effort toward improving safety records, particularly in the fields of flight crew training and in-flight support technologies that aid better decision-making, as well as in training the management of reactions to a startling occurrence. One way to achieve quality decision-making in the cockpit is by providing adequate cueing and response-activating mechanisms carefully designed to aid human information processing. Furthermore, these response performances, especially in the context of reactionary management of startle in-flight, could be honed through simulator-based training. This paper describes the simulation environment developed, along with the key characteristics that enable such a simulation experiment in the general aviation domain. The flight simulation platform, as an invaluable component in the endeavour to create a viable avenue for investigating unexpected error input, is also discussed, as are some key elements of current methods driving research into startle-imparted loss of control.
Adoption of Social Media as Communication Channels in Government Agencies
Social media has become an integral part of many people's lives around the world. The main use of this communication channel is to connect with social circles. It is also widely used for commercial and business purposes. Governments are also keen to use social media as an alternative to the traditional communication channels. Nonetheless, when the level of use of social media in the government is compared to other fields, a clear gap becomes apparent. This chapter investigates the adoption of social media as a communication channel between citizens, public agencies and government departments; and considers a wide range of factors that affect the issue from the perspective of public agencies. This chapter presents an extensive literature review and proposes a framework that organises the critical factors that affect public agencies' efforts while implementing social media. We also provide a list of hypotheses to validate and evaluate the significance of these factors.
Performance monitoring of Virtual Machines (VMs) of type I and II hypervisors with SNMPv3
Simple Network Management Protocol (SNMP) is a protocol with the capability to monitor the performance of IP-based devices. It can also monitor components installed on these devices to determine whether or not they are working. In an environment with hundreds of installed servers, it is not feasible to check manually that each machine is working properly. SNMP can also be used in a cloud computing environment for the monitoring of the hardware, infrastructure or virtual machines. To date, not much work has been done on cloud computing monitoring with SNMP. This research presents empirical results of using SNMPv3 to monitor virtual machines configured on Type I and II hypervisors, as well as the resources of the hypervisor itself (VMware ESXi 5.5). Additionally, this research involves the exploration of efficient ways of monitoring the resources, leading to the customisation of the MIB with AgentX to provide some management features using SNMPv3. The completed research work consists of four components: (1) a Network Management System (NMS), for which Zabbix is used; (2) an SNMP agent, installed on the devices to be monitored; (3) the virtual machines; and (4) the Type I and II hypervisors.
Introducing Controlling Features in Cloud Environment by Using SNMP
Simple Network Management Protocol (SNMP) is an application-layer protocol used to monitor IP-based devices. It can also monitor and manage IP-based devices without impacting their performance. SNMP is likewise used in cloud computing environments to monitor and control virtual machines; however, there is currently limited published research on deploying SNMP for monitoring and controlling virtual machines in such environments. Such control would allow load balancing in the cloud environment during peak hours, leading to reduced power consumption and supporting the Green IT cause. This paper discusses the deployment of SNMP for monitoring and controlling a Type 1 hypervisor (in a cloud environment). This is followed by the customisation of the MIB and net-snmp with AgentX to provide more SNMP management features. The completed research work provides rigorous physical experimentation involving SNMP monitoring and management for the Type I hypervisor Xen.
Managing Energy Efficiency in the Cloud Computing Environment Using SNMPv3: A Quantitative Analysis of Processing and Power Usage
In Europe, ICT equipment and services account for 2.5%–4% of the EU's carbon emissions. According to the Smart2020 report by the Global e-Sustainability Initiative, GeSI (2008), the ICT sector's emissions are expected to increase from 0.53 billion tonnes (Gt) of carbon dioxide equivalent (CO2e).
© The Authors, 2016. One of the Web 2.0 technologies that has increasingly been adopted and integrated into governmental agencies is social media (SM), in order to enhance their performance. This technology is regarded as an important and powerful tool due to its ability to enable two-way communication. Despite this fact, the figures for public participation in government 2.0 are still quite low, especially in most developing states, and Saudi Arabia is no exception. The available literature on this subject suggests extensive utilisation of Web 2.0 tools and SM by the private sector, while pointing to a quite contrary scenario in the public sector. The findings of this study indicate a lack of research regarding the factors affecting the implementation of SM as part of the e-government structure in the public sector. The objective of this research is to suggest a conceptual model based on the Technology-Organisation-Environment (TOE) model from a government perspective that will bridge the gap between the present e-governments and full-blown SM adoption, with particular emphasis on Saudi Arabia's e-government. The model introduced in this research provides a coherent framework for additional practical investigation into the use of SM. Using the qualitative method for data collection and analysis, particularly the case study approach, a study of 13 Saudi Arabian ministries was carried out through semi-structured interviews with key employees responsible for SM adoption and implementation in these ministries. The results indicate that some of these factors were observed in the Saudi environment, whereas others were less noticeable. The outcome of this research could also help the leaders in government to identify the best practices for achieving the full potential of implementing SM in government.
This study involves an investigation of the Green ICT strategy of a financial organization. The baseline for the Green ICT strategy implementation is elicited via a semi-structured interview and assessed using a bespoke tool developed for a SURF Maturity Model driven framework. This framework encompasses the Green ICT strategy, the greening of ICT in the organization, and the greening of operations in ICT. The results of the study reveal that the overall baseline score is 1.8 out of 5.0, which is relatively low. The overall target level set for the organization is 3.0 out of 5.0, accompanied by a roadmap and action plan (with several key action objectives) covering a 5-year timeframe to bridge the gap between baseline and target. An IT representative from the organization provides feedback on the action plan, leading to several amendments relating to cloud technology and a written business case for promoting a Green ICT strategy.
Studies have shown that mixed-resolution video codecs, also known as asymmetric spatial inter/intra-view video codecs, are successful in efficiently coding videos for low-bitrate transmission. In this paper, an HEVC-based spatial resolution scaling type of mixed-resolution coding model for frame-interleaved multiview videos is presented. The proposed codec is designed such that the information in intermediate frames of the center and neighboring views is down-sampled, while the frames still retain their original size. The codec's reference frame structure is designed to efficiently encode frame-interleaved multiview videos using an HEVC-based mixed-resolution codec. The multiview test video sequences were coded using the proposed codec and the standard MV-HEVC. Results show that the proposed codec gives significantly higher coding performance than the MV-HEVC codec at low bitrates.
Social media has become an integral part of many people’s lives around the world. The main use of this communication channel is to connect with social circles. Social media is also widely used for commercial and business purposes. Governments are also attempting to use social media to communicate with their citizens as an alternative to the traditional communication channels. Nonetheless, when the level of use of social media in the government is compared to other fields, e.g. social and commercial sectors, a clear gap becomes apparent. This chapter investigates the issue of adopting social media as a communication channel between citizens and public agencies (including government departments) and considers a wide range of factors that affect the issue from the perspective of public agencies. The purpose of the chapter is to provide an extensive literature review and propose a framework that organises the critical factors that affect public agencies’ efforts while implementing social media and to provide a list of hypotheses to validate and evaluate the significance of these factors. The validation and evaluation of critical factors will hopefully help the decision makers to better understand the concerns and challenges, and thereby enabling them to concentrate on efforts to propose the best practices to overcome the barriers to adopting social media.
The field of cloud computing has witnessed tremendous progress, with commercial cloud providers offering powerful distributed infrastructures to small and medium enterprises (SMEs) through their revolutionary pay-as-you-go model. Simultaneously, the rise of containers has empowered virtualisation, providing orchestration technologies for the deployment and management of large-scale distributed systems across different geolocations and providers. Big data is another research area which has developed at an extraordinary pace as industries endeavour to discover innovative and effective ways of processing large volumes of structured, semi-structured, and unstructured data. This research aims to integrate the latest advances within the fields of cloud computing, virtualisation, and big data for a systematic approach to stream processing. The novel contributions of this research are: (1) MC-BDP, a reference architecture for big data stream processing in a containerised, multi-cloud environment; (2) a case study conducted with the Estates and Sustainability departments at Leeds Beckett University to evaluate an MC-BDP prototype within the context of energy efficiency for smart buildings. The study found that MC-BDP is scalable and fault-tolerant across cloud environments, key attributes for SMEs managing resources under budgetary constraints. Additionally, our experiments on technology agnosticism and container co-location provide new insights into resource utilisation, cost implications, and optimal deployment strategies in cloud-based big data streaming, offering valuable guidelines for practitioners in the field.
There has been increasing demand for multiview video transmission over band-limited channels in recent years, and various techniques have been proposed to fulfil this need. In this paper, a High Efficiency Video Coding (HEVC) based spatial resolution scaling type of mixed-resolution coding model, MRHEVC-MVC, for frame-interleaved multiview videos is presented. Enabling HEVC to encode video with different frame resolutions is a challenge, however, due to the coding tree partitioning used by the codec. This has been overcome by superimposing the low-resolution replica of each full-resolution frame on its respective decoded picture buffer and setting the remaining space of the frame buffer to zero. The codec's reference frame structure is designed to efficiently encode frame-interleaved multiview videos using an HEVC-based mixed-resolution codec. The proposed MRHEVC-MVC codec has been tested against the standard multiview extension of HEVC (MV-HEVC) on the “Balloon”, “Newspaper1”, “Undo_Dancer”, “Kendo” and “Poznan_Street” standard multiview video sequences. Results show that the proposed codec gives significantly higher coding performance than the MV-HEVC codec at low bitrates, both subjectively and objectively.
This paper presents a High Efficiency Video Coding (HEVC) based spatial mixed-resolution stereo video codec. The proposed codec applies a frame interleaving algorithm to reorder the stereo video frames into a monoscopic video. The challenge in mixed-resolution video coding is to enable the codec to encode frames with different frame resolutions. This issue is addressed by superimposing a low-resolution replica of the decoded I-frame on its respective decoded picture, with the remaining space of the frame set to zero. This significantly reduces the computation cost of finding the best match. The proposed codec's reference frame structure is designed to efficiently exploit both temporal and inter-view correlations. The performance of the proposed codec is assessed using five standard multiview video datasets and benchmarked against that of the anchor and state-of-the-art techniques. Results show that the proposed codec yields significantly higher coding performance than the anchor and state-of-the-art techniques.
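The superimposition step can be sketched as follows. This is an illustrative toy on a 2-D list of pixel values, using plain pixel decimation in place of a proper downsampling filter, whereas the real codec operates on decoded picture buffers:

```python
def superimpose_low_res(frame, factor=2):
    """Embed a downsampled replica of `frame` in a zero buffer of the
    original size: the low-resolution copy occupies the top-left region
    and the remaining space is set to zero, so the buffer keeps the full
    frame dimensions that the codec's reference structure expects.
    """
    h, w = len(frame), len(frame[0])
    buffer = [[0] * w for _ in range(h)]
    for y in range(0, h, factor):
        for x in range(0, w, factor):
            # decimation: keep every `factor`-th pixel (a real encoder
            # would apply a low-pass downsampling filter first)
            buffer[y // factor][x // factor] = frame[y][x]
    return buffer

frame = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
mixed = superimpose_low_res(frame)
```

Because only the top-left quadrant holds content, motion search against such a reference is confined to a quarter of the area, which is the source of the reduced matching cost.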
To accommodate latency-sensitive IoT and AI workloads, serverless computing is becoming more popular in edge environments. However, the default Kubernetes scheduler is resource-agnostic and ignores the energy and performance limitations of edge nodes. Prior approaches usually optimise only for latency or energy, ignoring the combined effects of cold-start dynamics, inter-node communication, and inter-service dependencies. In this work, we propose a lightweight heuristic scheduling approach that combines inter-service traffic, energy, and latency into a single cost function. The approach, implemented as a custom Kubernetes Scheduling Framework plugin, has low overhead and is used in conjunction with a descheduler that consolidates workloads by draining underutilised nodes. This combination enables short-term responsive placements and long-term energy efficiency. We test the system on a Raspberry Pi cluster using Knative workloads typical of IoT analytics workflows. Average latency decreased by 29%, failure rates by 74%, and energy consumption per request by 32%, all consistent improvements over the default scheduler. These results show that multi-objective, metrics-aware placement, particularly when combined with descheduling for consolidation, can significantly improve the quality-of-service objectives and energy efficiency of serverless edge platforms.
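A single weighted cost function over the three terms might look like the following sketch. The metric names, weights, and node figures are illustrative assumptions, not the paper's exact formulation:

```python
def place(nodes, w_latency=1.0, w_energy=1.0, w_traffic=1.0):
    """Pick the node minimising a weighted sum of the three terms the
    scheduler combines: expected per-request latency (ms), marginal energy
    per request (J), and an inter-service traffic penalty for hosting the
    service away from its communication partners.
    """
    def cost(node):
        return (w_latency * node["latency_ms"]
                + w_energy * node["energy_j"]
                + w_traffic * node["traffic_cost"])
    return min(nodes, key=cost)

# Hypothetical candidate edge nodes in a Raspberry Pi cluster
nodes = [
    {"name": "pi-1", "latency_ms": 40, "energy_j": 2.0, "traffic_cost": 5},
    {"name": "pi-2", "latency_ms": 25, "energy_j": 2.5, "traffic_cost": 1},
    {"name": "pi-3", "latency_ms": 60, "energy_j": 1.5, "traffic_cost": 9},
]
best = place(nodes)
```

In a real Scheduling Framework plugin this cost would be evaluated in the Score extension point for each filtered node; the descheduler then periodically undoes placements that leave nodes underutilised.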
© 2018 Elsevier Ltd. A real PV array combined with two storage solutions (B, battery, and H, hydrogen reservoir with electrolyzer-fuel cells) is modeled in two geolocations: Oxford, UK, and San Diego, California. All systems meet the same 1-year, real domestic demand. Systems are first configured as standalone (SA) and then as grid-connected (GC), receiving 50% of the yearly-integrated demand. H and PV are dynamically sized as a function of geolocation, battery size B_M and H's round-trip efficiency η_H. For a reference system with battery capacity B_M = 10 kWh and η_H = 0.4, the required H capacity in the SA case is ∼1230 kWh in Oxford and ∼750 kWh in San Diego (respectively ∼830 kWh and ∼600 kWh in the GC case). Related array sizes are 93% and 51% of the reference 8 kWp system (51% and 28% for GC systems). A trade-off between PV size and battery capacity exists: the former grows significantly as the latter shrinks below 10 kWh, while it is insensitive to B_M rising above this value. Such a capacity achieves a separation of timescales: B, costly and efficient, is mainly used for frequent transactions (daily periodicity or less), while cheap, inefficient H is used for seasonal storage. With current PV and B costs, the SA reference system in San Diego can stay within 2·10^4 $ CapEx if H's cost does not exceed ∼7 $/kWh; this figure increases to 15 $/kWh with the grid constantly/randomly supplying half of the yearly energy (6.5 $/kWh in Oxford, where no SA system is found below 2·10^4 $ CapEx). Rescaling San Diego's array (further from its optimal configuration than Oxford's) by the ratio between the local global horizontal irradiance (GHI) and Oxford's GHI yields in all cases an 11% reduction in size and corresponding cost, with the other model outputs unaffected. The location-dependent results vary to different extents when extending the modeled timeframe to 18 years; in all cases, the variability stays within ±10% of the reference year.
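The separation of timescales between battery and hydrogen storage can be illustrated with a toy hourly dispatch rule. The greedy charge/discharge order, capacities and figures below are illustrative assumptions, not the paper's model:

```python
def dispatch(net_kwh, b_max=10.0, eta_h=0.4):
    """Toy hourly dispatch: surplus charges the efficient, capacity-limited
    battery first and the hydrogen path absorbs the rest; deficits draw on
    the battery first, then on hydrogen, whose round-trip efficiency eta_h
    means extra stored energy must be withdrawn to cover conversion losses.

    `net_kwh` is a sequence of hourly (PV generation - demand) values.
    Returns the final battery and hydrogen storage levels in kWh.
    """
    battery = 0.0
    h2 = 0.0  # energy committed to the hydrogen reservoir
    for net in net_kwh:
        if net >= 0:
            to_battery = min(net, b_max - battery)
            battery += to_battery
            h2 += net - to_battery          # overflow goes to hydrogen
        else:
            need = -net
            from_battery = min(need, battery)
            battery -= from_battery
            h2 -= (need - from_battery) / eta_h  # losses charged to H
    return battery, h2

battery, h2 = dispatch([6.0, 8.0, -4.0, -3.0])
```

Even in this toy, the battery handles the frequent daily swings while the hydrogen store only sees what overflows or outlasts it, mirroring the daily-versus-seasonal split the study reports.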
Development of efficient video codecs for low-bitrate video transmission with higher compression efficiency has been an active area of research in recent years, and various techniques have been proposed to fulfil this need. In this paper, a mixed-resolution based video codec for low-bitrate transmission within the standard HEVC framework is proposed: a spatial resolution scaling type of mixed-resolution coding model for monoscopic videos using the HEVC codec. The proposed mixed-resolution structure and reference frame structure simplify the implementation of a mixed-resolution based HEVC codec that can code video frames with different resolutions. To evaluate the performance of the proposed codec, three 4:2:0 format test video sequences, namely “Cactus”, “KristenAndSara” and “ParkScene”, were selected and coded using the proposed codec and the standard HEVC codec. Experimental results show that the proposed mixed-resolution based HEVC codec gives significantly higher coding performance than the standard HEVC codec at low bitrates.
Dependable IoT for Human and Industry Modeling, Architecting, Implementation
This work has several unique characteristics that should change the reader's perspective and, in particular, provide a more profound understanding of the impact of the IoT on society.
Assessment of the potential use of grid portal features in e‐government
Purpose – The purpose of this paper is to overview key features of grid portals and e-government portals and to assess the potential for using features of the former in the latter. In the context of this paper, grid portals are defined as graphical user interfaces that a user employs to interact with one or more grid infrastructural resources.
Design/methodology/approach – The paper classifies grid portals into five categories and two development frameworks and, based on this classification, overviews ten existing grid portals. The overview covers, where possible, the developers, the objective, the implementation, and the features of the considered grid portals. For e-government, the paper focuses on an overview of a typical e-government portal and best design practices. Based on the overview of grid portals and the typical e-government portal, the paper assesses the potential benefit of grid portals in meeting the critical success factors for e-government, identified as: integration, knowledge management, personalization, and customer engagement. The results are tabulated, analysed, and discussed.
Findings – Many of the features of existing grid portals have the potential to be used within an e-government portal, but the lack of any in-depth study of the nature of the e-government application domain (from a technical and social perspective) in line with grid development makes this potential far from reachable at this stage. This is disappointing but does highlight opportunities.
Practical implications – This paper motivates a greater in-depth analysis and study of the potential use of the grid for e-government. The grid infrastructure promises solutions to various application domains, including e-government.
Originality/value – This paper explores the potential of a technology infrastructure for e-government. This exploration is based on a novel dual overview and evaluation of the technology and the application domain.
The paper can be a basis and a reference for further research in different areas including, among others: technology infrastructures for e-government, grid development for various application domains, benchmarking of grid utility and usability for various application domains, grid gateways, and emerging technologies to meet the critical success factors for e-government. © 2008, Emerald Group Publishing Limited
e‐Government in Jordan: challenges and opportunities
The purpose of this paper is to examine the challenges encountered in e‐government implementation, as well as the potential opportunities available, in the context of Jordanian society. A detailed examination and analysis of Jordan's published e‐government vision and strategy is presented, together with a review of other relevant literature. The findings reveal that Jordan still lags behind in utilising information and communication technologies to deliver government services online. An understanding of the current status of e‐government in Jordan can help policy makers pursue the development of public sector organisations, and is of importance for Jordan's future economic success. This is believed to be the most up‐to‐date and comprehensive analysis of Jordan's plans and assessment of its level of readiness for delivery of e‐government services.
Observations of the Scottish elections 2007
The purpose of this paper is to provide an observational examination of the recent Scottish elections, in which an e‐counting system was employed for the first time to manage the increased complexity of the Scottish electoral system. The study draws on observations of an ethnographic nature, supplemented by written documentation used for both training and public consumption during the Scottish election process. It was found that the voting system for the Scottish elections had not received sufficient review or testing prior to the election, and further that the design choices imposed by the DRS software did not support the actions of its users efficiently enough, or justify confidence in the dependability of the system. The paper concludes that the deployment of e‐counting systems requires careful consideration; many of the issues raised are similar to those of the official Scottish Elections Review, to which our team provided input. The Scottish elections were the first to allow members of the public to register as election observers, accredited by the Electoral Commission. As such, they represented the first large‐scale opportunity for the academic community to observe such processes.
Stakeholder enfranchisement
The purpose of this paper is to clearly identify a problem area in participation and indicate the potential for technology and eParticipation research as a response to the problem. The approach taken is to develop a hypothetical argument illustrating deficiencies in the current UK approach to public participation and the use of expert evidence in consultation processes. The argument is developed using traffic analysis as an exemplar of such expert input to planning enquiries. The literature indicates that the current confrontational approach needs to be replaced by a process that supports informative communication and learning, treats citizens fairly, and empowers them to have genuine impact on decisions. Based on the hypothesis that the capability to re‐organise and present the same data in different forms and contexts enables information technology (IT) to bridge the gap between different stakeholder groups, the paper proposes the development of a collaborative approach to traffic assessment. Such an enhanced process with appropriate IT support – SIRTASS – will enable planning activities to achieve better decisions with greater community and citizen acceptance. If applied as a general approach, there is the potential to significantly improve the speed and quality of the current UK system. This paper is part of the debate about lack of participation and points to a particular area where research could make a significant contribution.
Current teaching
- MSc Green Computing modules: ICT and Environment, Green Computing Technologies, Responsibly Green
- Masters Degree Modules: Digital Media Communications, Research Practice, Dissertations
- Level 6: AI in Business, Research Project, Project Implementation and Evaluation
- Level 4: Web Development
Featured Research Projects
Application of artificial intelligence to improve the safety of nuclear power plants
This project applies artificial intelligence to improve nuclear power plants' availability, reduce the cost of operation, support decision-making in the nuclear control room, and assist preparedness for different accident levels.
Research collaborations (School of Built Environment, Engineering and Computing):
- Dr Akbar Sheikh Akbari, Reader
- Dr Oleg Illiashenko, Senior Lecturer
- Dr Anatoliy Gorbenko, Reader
- Dr Nawar Jawad, Senior Lecturer
- Dr Satish Kumar, Lecturer
- Dr Duncan Mullier, Senior Lecturer
- Lesley Earle, Part-Time Lecturer
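Collaboration networks like this are typically delivered to the page as a node-link JSON graph: a `nodes` array of people and a `links` array of `source`/`target` id pairs. A minimal Python sketch, using a trimmed copy of that structure (only `id` and `name` fields are kept for brevity), that lists a given person's collaborators:

```python
import json

# Trimmed node-link collaboration graph, following the schema
# embedded in the profile page (full records carry job titles,
# images, and publication counts as well).
data = json.loads("""
{
  "nodes": [
    {"id": "6513",  "name": "Professor Ah-Lian Kor"},
    {"id": "19660", "name": "Dr Akbar Sheikh Akbari"},
    {"id": "21809", "name": "Dr Anatoliy Gorbenko"}
  ],
  "links": [
    {"source": "6513", "target": "19660"},
    {"source": "6513", "target": "21809"}
  ]
}
""")

# Map ids to display names for easy lookup.
names = {node["id"]: node["name"] for node in data["nodes"]}

def collaborators(graph, person_id):
    """Return the names linked to person_id, in either direction."""
    out = []
    for link in graph["links"]:
        if link["source"] == person_id:
            out.append(names[link["target"]])
        elif link["target"] == person_id:
            out.append(names[link["source"]])
    return out

print(collaborators(data, "6513"))
# ['Dr Akbar Sheikh Akbari', 'Dr Anatoliy Gorbenko']
```

The link check runs in both directions because node-link data often records each edge only once, with no guarantee of which endpoint is the source.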

