CEBE Upcoming Research Seminars

Wednesday 10th May, 12pm

With Dr Hafiz Alaka (Centre for Integrated Design Construction)

Big Data analytics and predictive analytics for the construction industry

My research focuses on the use of predictive analytics and Big Data Analytics (BDA) in the construction sector. This is being applied in two main areas. The first, which is already being implemented, is predicting the performance of construction firms in terms of potential failure or survival. The second is predicting the properties of fly ash concrete from the physical and chemical properties/composition of the concrete constituents (i.e. cement, fly ash, aggregate, among others). This seminar presentation will focus on a study on predicting the performance of construction firms, as explained below.

In a pioneering effort, the study is the first to develop a construction firm insolvency prediction model (CF-IPM) that uses BDA; combines qualitative and quantitative variables; employs advanced artificial intelligence tools such as Random Forest and Bart Machine; and draws on data from construction firms (CFs) of all sizes, ensuring wide applicability.

The pragmatism paradigm was employed to allow the use of mixed methods: interviews, questionnaires and financial ratios were used to establish the variables necessary for predicting the potential failure or survival of construction firms. Using Cronbach's alpha and factor analysis, the qualitative variables were reduced from over 100 to 13 (Q1 to Q13). In addition, 11 financial ratios (i.e. quantitative variables) (R1 to R11) reported by large, medium, small and micro CFs were identified for the sample CFs.
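
As an illustration of the reliability step, the following is a minimal Python sketch of Cronbach's alpha; the DataFrame layout and variable names are hypothetical, as the abstract does not describe the actual questionnaire data.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a group of questionnaire items (one column per item)."""
        k = items.shape[1]                          # number of items in the group
        item_vars = items.var(axis=0, ddof=1)       # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical usage: 'responses' holds Likert-scale answers, one column per
    # candidate qualitative variable; item groups with low alpha would be dropped
    # or refined before factor analysis reduces the pool to Q1-Q13.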

The BDA system was set up with the Amazon Web Services Elastic Compute Cloud using five ‘Instances’ as Hadoop DataNodes and one as NameNode. The NameNode was configured as Spark Master. Eleven variable selection methods and three voting systems were used to select the final seven qualitative and seven quantitative variables, which were used to develop 13 BDA-CF-IPMs. The Decision Tree BDA-CF-IPM was the model of choice in this study because it had high accuracy, low Type I error and transparency.
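
To make the modelling step concrete, here is a minimal PySpark sketch of training and inspecting a decision tree classifier on the selected variables; the file path, column names, label encoding and train/test split are assumptions for illustration, not the study's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import DecisionTreeClassifier
    from pyspark.ml.evaluation import BinaryClassificationEvaluator

    spark = SparkSession.builder.appName("cf-ipm").getOrCreate()
    df = spark.read.parquet("hdfs:///cf_ipm/firms.parquet")  # hypothetical HDFS path

    # Seven qualitative and seven quantitative variables (names assumed here).
    features = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6", "Q7",
                "R1", "R2", "R3", "R4", "R5", "R6", "R7"]
    data = VectorAssembler(inputCols=features, outputCol="features").transform(df)
    train, test = data.randomSplit([0.8, 0.2], seed=42)

    # Assumed label encoding: 1 = insolvent, 0 = survived.
    model = DecisionTreeClassifier(labelCol="label", featuresCol="features").fit(train)

    auc = BinaryClassificationEvaluator(labelCol="label").evaluate(model.transform(test))
    print(f"Test AUC: {auc:.3f}")
    print(model.toDebugString)  # the readable tree is what gives the model its transparency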

Wednesday 17th May, 1pm 

With Gavin Smart, Deputy Chief Executive, Chartered Institute of Housing

Housing supply and affordability: New ideas for future strategies

This presentation will consider the dimensions and consequences of the current supply and affordability challenges facing housing in the UK, and in England in particular. Building on this attempt to describe and understand the nature of the problem, the presentation will then explore possible policy changes and practical solutions that could help to address affordability more effectively and drive up housing supply.

Wednesday 24th May, 12pm

With Dr Shadi Basurra (Centre for Cyber Security)

This research seminar reports on a practical process for evaluating retrofit technology for zero carbon performance, in which the calibration outcome is used to quantify uncertainty in building performance prediction before and after retrofit.

This process is performed in two phases. The first phase is to develop and calibrate the model before retrofitting. This model is used to run simulations to design the retrofit parameters; it also identifies the most sensitive parameters, and whether or not they are physically observable. In the second phase, we update the model to include all retrofit improvements made to the property and perform further calibration, since the model must incorporate the additional uncertainties introduced by those improvements. This allows us to understand whether the calibrated model generated before retrofit still applies after retrofit. The buildings under analysis are semi-detached houses belonging to Birmingham City Council in the UK.

Wireless sensors are used to collect various building performance data, such as internal and external air temperature, solar radiation, and gas and electricity consumption, which are used to calibrate the model before and after the retrofit. For calibration, we use K Nearest Neighbour (KNN) to conduct parameter sensitivity analysis, with the aim of fine-tuning the model and establishing a one-to-one relationship between the simulated and actual building performance.
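
One plausible reading of the KNN step, sketched in Python below: run the simulator over many sampled parameter sets, then use nearest neighbours in output space to find the parameter sets whose simulated performance best matches the sensor data. The data shapes and random stand-ins are hypothetical; the abstract does not specify the actual procedure.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    sim_params = rng.uniform(size=(500, 6))    # sampled model inputs (stand-in)
    sim_outputs = rng.normal(size=(500, 12))   # simulated performance per run (stand-in)
    measured = rng.normal(size=(1, 12))        # sensor-derived performance (stand-in)

    # Parameter sets whose simulated outputs sit closest to the measurements.
    nn = NearestNeighbors(n_neighbors=5).fit(sim_outputs)
    dist, idx = nn.kneighbors(measured)
    best_params = sim_params[idx[0]]

    print("calibrated parameter estimates:", best_params.mean(axis=0))
    print("spread across neighbours (sensitivity hint):", best_params.std(axis=0))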

A case study is presented in which the annual electricity and gas consumption predicted by jEPlus+EA (which uses EnergyPlus as its core engine) was within 1% of the actual energy consumption of the buildings. This was achieved after three iterations over the base case model.

Thursday 8th June, 12pm

With Professor Achim Jung (University of Birmingham)

The Church-Turing Thesis

At one level, the Church-Turing Thesis (CTT) is a quite clear and simple statement: All formalisations of the intuitive notion of computability are equally expressive. However, one might want to analyse this a bit more carefully, and also consider the context in which computation takes place. For example, we can consider machines that are connected to other machines, or machines that have a built-in notion of data type. Perhaps surprisingly, in these more refined settings the CTT is no longer valid, in the sense that otherwise perfectly natural computational formalisms are weaker than what one might like to call computable.
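
For reference, the equivalence underlying the thesis is the classical theorem of Church, Kleene and Turing: for a partial function f on the natural numbers,

    \[
    f \text{ is Turing-computable}
    \iff f \text{ is } \lambda\text{-definable}
    \iff f \text{ is } \mu\text{-recursive.}
    \]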

In this talk I want to explain this phenomenon and speculate as to why this could be an interesting point when considering the computational possibilities of the brain.

For more information and to book your place at any of the seminars please contact Ian McDonald.