Location: The Octagon, The University of Sheffield, Western Bank, Sheffield, S10 2TN
Date: 13th July 2015, 9:30am – 4pm
Now in its 5th year, Innovations in Healthcare aims to build on previous successes, bringing together individuals from across the healthcare sector to see why The University of Sheffield is at the forefront of healthcare research and to discuss potential research collaborations.
Last year saw over 300 delegates register from 160 companies, both large and small, looking to learn more about research at The University of Sheffield through presentations, exhibitions and 1:1 networking. We are delighted to say that we will be demonstrating at the event – come down and see our exhibition!
To register, click here
York Computational Immunology Lab Inaugural International Meeting
Computational and Mathematical Approaches to Immunological Challenges
Date: Monday 3rd November – Tuesday 4th November 2014
Venue: Bedern Hall, York
Professor Mike Holcombe will be attending the Computational Immunology Lab meeting on the 3rd – 4th November.
The meeting brings together world-leading researchers to discuss the application of computational and mathematical approaches to key immunological challenges. Delegates will consider how we can maximise the potential of computational and mathematical approaches, the role of these approaches in drug discovery, and the current methods, tools and techniques available to generate novel biological insights from models. We thank the Wellcome Trust-funded Centre for Chronic Diseases and Disorders at the University of York for funding this meeting. Attendance is by invitation only.
Professor Mike Holcombe has been assisting in the organisation of a conference at the HM Treasury on the afternoon of 13th October 2014.
In his talk, “Agent Based Modelling for the Economy”, Professor Holcombe will provide a brief overview of an economic model and an illustration of how it can be used, with a discussion of the added value provided by the agent-based modelling framework. He has written previous articles such as “Large-scale modeling of economic systems” (Complex Systems, 22(2), 175–191) and has been heavily involved with the EURACE model.
The Internet of Things (IoT) is the general term that covers a world where smart systems, sensors and a large array of devices are connected together to offer businesses and consumers a more streamlined experience.
Market researcher Gartner last year predicted in its Hype Cycle that 26 billion devices would be connected by 2020, with Morgan Stanley increasing that figure to 75 billion. Others believe that only 50 million devices will be connected, but these are all estimates and no one really knows how many there will be. Do we really need to put a figure on it? We all know that the world will become more connected. The connectivity in these networks will provide new ways of thinking and opportunities to exploit and act upon information, disrupt business models, and introduce new security threats.
Take manufacturing, for instance: linking up IoT sensors and robotics to automate the manufacturing process – and now let’s throw in Big Data. Across the world, manufacturers are starting to link Big Data and the IoT within their manufacturing processes to transform and disrupt them. Big Data within the manufacturing process is becoming a game changer: while companies have been producing data for years, new data tools are enabling real-time analysis that provides real-time problem solving, machine health monitoring and cost avoidance. Germany initiated its Industry 4.0 government initiative to encourage its industrial sector to realise the potential of this connection – real-time, data-driven decision support systems for the factory that link in with logistics providers, other manufacturers of component parts and their clients; a system capable of synchronising all the areas in the supply chain and providing full visibility to all of those involved. The combination of IoT and Big Data optimisation is bringing about huge opportunities.
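As a flavour of what real-time machine health monitoring can look like at its simplest, the sketch below flags a machine for maintenance when the rolling average of a sensor reading drifts above a threshold. This is an illustrative toy only – the class name, window size and threshold are all hypothetical, and real deployments use far richer predictive models.

```python
from collections import deque

class MachineHealthMonitor:
    """Toy real-time health check: alert when the rolling mean of a
    sensor reading (e.g. vibration level) exceeds a threshold."""

    def __init__(self, window=5, threshold=10.0):
        self.readings = deque(maxlen=window)  # keeps only the last `window` readings
        self.threshold = threshold

    def update(self, reading):
        """Ingest one reading; return True if maintenance is advised."""
        self.readings.append(reading)
        return self.rolling_mean() > self.threshold

    def rolling_mean(self):
        return sum(self.readings) / len(self.readings)

# Feed in a stream of (hypothetical) vibration readings.
monitor = MachineHealthMonitor(window=3, threshold=10.0)
alerts = [monitor.update(r) for r in [8.0, 9.0, 9.5, 12.0, 14.0]]
# The last two readings push the rolling mean over the threshold.
```

The point of even such a crude rule is that the decision happens as the data arrives, rather than in a batch report weeks later.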
These processes are not limited to manufacturing: any environment with a supply chain can benefit from the information provided by linked devices, and from access to Big Data to inform its decision support. The medical profession is one example – in environments such as hospitals, where patients are the core component, a system of people-oriented processes can be interlinked with multiple chain events.
Predictive systems play a key role in the Industrial IoT and supply chains. Intel and Mitsubishi piloted automated systems at Intel’s manufacturing facility in Malaysia, incorporating Big Data solutions to focus on improving productivity. By incorporating predictive machine health monitoring to reduce component failure, the pilot optimised the process and realised savings of $9 million during its course.
Realising the vision of IoT in manufacturing, interlinking all aspects of the supply chain, will be a challenge from an IT perspective. Many existing companies operate legacy structures that differ greatly from open architectures and data sharing, and data from disparate sources cannot simply be merged. To support the IoT, common data models, standards and architectures that span the supply chain will need to be brought together. In some industries these standards already exist: industries such as food and drink have standard parameters across the supply chain to measure the temperature of containers, track shipments and analyse food contamination.
The use of Big Data and predictive systems in IoT manufacturing will represent great economic value. Optimised and automated factories will achieve results at a faster rate, reduce energy and operating costs through increased efficiency, and run processes that react to demand. Incorporating real-time data will open up markets for companies able to respond quickly to changing consumer demand and supply markets faster, in line with that demand. Future visions include predictive machine health monitoring not just in the manufacturing process but also in after-sales, giving companies streamlined opportunities to supply their customers with parts before complete failure. Big Data, predictive models and the IoT could open up greater markets that would benefit everyone in the supply chain.
When two leading infrastructure and transport companies – Costain and Thameslink – wanted to be sure their designs for major London railway stations would maximise pedestrian flow and passenger comfort, they turned to a team of computer scientists at the University of Sheffield.
An extract from an article back in August 2013 in the University’s Discover magazine.
Professor Mike Holcombe talks about our simulation modelling, smart cities and big data in healthcare. We are indeed still working alongside infrastructure companies to model the behaviour of crowds within environments, enabling designers and decision makers to look at their ideas to help reduce the risk of unwanted situations occurring.
On the 10th April, NVIDIA’s Calisa Cole interviewed Dr Paul Richmond on “Accelerated Agent-Based Simulation of Complex Systems”.
NVIDIA: Paul, tell us about the FLAME GPU software which you developed.
Paul: Agent-Based Simulation is a powerful technique used to assess and predict group behavior from a number of simple interacting rules between communicating autonomous individuals (agents). Individuals typically represent some biological entity such as a molecule, cell or organism and can therefore be used to simulate systems at varying biological scales. The Flexible Large-scale Agent Modelling Environment for the GPU (FLAME GPU) is a piece of software which enables high-level descriptions of communicating agents to be automatically translated to GPU hardware. With FLAME GPU, simulation performance is enormously increased over traditional agent-based modeling platforms and interactive visualization can easily be achieved. The GPU architecture and the underlying software algorithms are abstracted from users of the FLAME GPU software, ensuring accessibility to users in a wide range of domains and application areas.
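To make the core idea concrete – group behavior emerging from simple local rules between communicating agents – here is a minimal, language-neutral sketch in Python. It is not FLAME GPU code (FLAME GPU uses XML model descriptions and GPU agent functions); the names and the opinion-averaging rule are purely illustrative.

```python
import random

class Agent:
    """A minimal agent: one state variable and one interaction rule."""

    def __init__(self, opinion):
        self.opinion = opinion

    def step(self, neighbours):
        # Simple local rule: move halfway towards the neighbourhood average.
        avg = sum(n.opinion for n in neighbours) / len(neighbours)
        self.opinion += 0.5 * (avg - self.opinion)

def simulate(n_agents=10, n_steps=50, seed=42):
    random.seed(seed)
    agents = [Agent(random.uniform(0.0, 1.0)) for _ in range(n_agents)]
    for _ in range(n_steps):
        # Synchronous update: every agent reads the same snapshot of the
        # population, loosely mirroring message-based agent communication.
        snapshot = [Agent(a.opinion) for a in agents]
        for a in agents:
            a.step(snapshot)
    return [a.opinion for a in agents]

opinions = simulate()
spread = max(opinions) - min(opinions)
# Despite no central controller, the population converges to consensus.
```

The emergent consensus here is the simplest analogue of the group behavior Paul describes; the synchronous, data-parallel update over the whole population is also exactly the shape of computation that maps well to a GPU.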
NVIDIA: How does FLAME GPU leverage GPU computing?
Paul: Unlike other agent-based simulation frameworks, FLAME GPU is designed from the ground up with parallelism in mind. As such it is possible to ensure that agents and behavior are mapped to the GPU efficiently in a way which minimizes data transfer during simulation. One of the most exciting aspects of GPU-accelerated simulation is that simulations can often be run faster than real-time. For example, a pedestrian evacuation model can be matched to real-world conditions and an evacuation plan can be simulated, essentially looking into the future for potential danger or problems. This forms the basis of my current work into using such agent simulation techniques for prediction and decision making.

[Example of pedestrian evacuation modeling simulation with FLAME GPU.]
NVIDIA: What challenges did you face?
Paul: GPUs are very good at simulating homogeneous groups of agents where behavior is consistent across a population. In homogeneous cases, behavior can be executed as kernels with very little divergence, which results in high performance. However, as the complexity of agents within a population increases, so too does the heterogeneity, ultimately impacting performance…
To read the full article, click here