Describe six novel applications of expert systems/neural networks/genetic algorithms (2 of
each technology) in an intelligent building complex.
There are four things we need to know when building a system to solve a particular problem (Rich (1), 1991):
- Define the problem precisely, including the initial situation and the acceptable final solutions.
- Analyse the problem, picking out important features that help decide possible techniques to use.
- Isolate the knowledge needed for the task.
- Choose the best problem-solving techniques.
Throughout this document, six problems have been picked as possible candidates for the application of either Expert Systems, Neural Networks or
Genetic Algorithms. For each, the problem is detailed and an acceptable solution is suggested. In addition, the knowledge required is discussed
and the most appropriate technique is suggested with reasons why.
The Conclusion highlights some other examples and draws together comparisons between the three techniques discussed.
The Appendices contain some additional material that relates to parts of the text.
Classical Expert System Applications
'We need to exploit the knowledge we can glean from people since they are the best known performers of most of the tasks with which we are dealing'
(Rich (2), 1991).
An Expert System is essentially a rule-based system with a knowledge base obtained from human experts in the particular field in which the system operates.
Fire Exit Indicators
Fire regulations dictate that Fire Exits are clearly marked and that there are clear notices, showing fire exit points, posted around a commercial
building. In addition, there is a requirement for certain people to be designated fire marshals to supervise building egress and head counts.
There is, however, a strong reliance on people to read the notices and remember them; if we are honest, we tend to treat them with the same
respect as the life jacket demonstrations given just before take-off on aircraft, and then act in a panic when there is an alarm. Technology
exists that can provide more immediate information to the occupant in case of an emergency.
An Expert System could be designed that provided intelligent signage for local groups to follow. For example, adapted smoke heads and sprinkler
points could use modified devices that included a low power LED panel. Arrows made up of flashing LEDs would indicate the nearest fire exit
away from the fire thus cutting out the risk of people forgetting the best exit in a panic, or the risk of visitors not knowing where to go.
Perhaps a suitable name for this is 'Fast Egress System' or FES!
A substantial amount of information would need to be gathered by the system and sent down the system bus linking the devices. The sort of
information required is detailed below:
- The location of the initial alarming device.
- The locations of any additional alarming devices.
- All routes through and out of the building.
- PIR feedback indicating where people were in the building.
- Person movement tracking devices that could measure the speed of movement. (Such as a mini-version of 'Traffic Master' that monitors
traffic flow on many of the motorways in the UK) (WWW1).
- Where smoke/fumes may be spreading in the ductwork.
- Smoke/Fume levels in all parts of the building.
Fire experts would be able to provide a set of rules which would indicate which routes were the best to take bearing in mind the above
information available to FES. One rule may be 'IF initial alarming device is near Fire Exit 1 THEN point towards Fire Exit 2', or 'IF the
way to Fire Exit 3 is slower than Fire Exit 4 THEN point towards Fire Exit 1, 2 or 4'.
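To make the rule idea concrete, the routing logic might be sketched as below. This is a minimal illustration only: the exit names, zones and the congestion penalty are hypothetical, and a real FES would apply many more expert rules over live sensor data.

```python
def choose_exit(exits, fire_zone, congestion):
    """Pick the exit the local signage should point towards.

    exits: dict of exit name -> {"zone": str, "travel_time": float (seconds)}
    fire_zone: zone containing the initial alarming device
    congestion: dict of exit name -> people currently heading that way
    """
    # Rule: IF an exit lies in the zone of the initial alarming device
    # THEN never point towards it.
    safe = {name: e for name, e in exits.items() if e["zone"] != fire_zone}

    # Rule: IF a route is congested THEN penalise its effective travel time.
    def score(item):
        name, e = item
        return e["travel_time"] + 0.5 * congestion.get(name, 0)

    # Rule: point towards the quickest remaining exit.
    return min(safe.items(), key=score)[0]
```

As the fire and the crowds move, the inputs change and the function would be re-evaluated, which is the 'fluid' behaviour described below.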
Such a system would be fluid, in that as the fire changed its course the FES would re-evaluate its direction arrows accordingly to
give the best routes out for people in particular localities. Jams would be less likely as people are diverted from relatively more congested
areas (depending on safety considerations), and the flow of people out of the building is likely to be more efficient.
Other benefits would include a psychological one, in that the panic question of 'which is the best way out?' is answered. The system
reassures you that it is aiming to get you out of the building as quickly as possible and away from the fire or bomb threat. You no longer
have to guess which way to go or just follow the crowd, since the 'building expert' knows the way.
Computer Upgrade Management
A curse of our modern commercial culture is ever-changing computer requirements: in hardware, as processor speeds increase, bus
types for graphics and memory improve, and sound/video become requirements rather than luxuries; and in software, as so-called 'improved'
revisions force the working populace to walk the upgrade treadmill in order to remain compatible with others in their business dealings.
Rather than fight this inexorable onslaught of 'bloatware', there could be a way to manage the problem.
Amongst the masses there are always those experts who have an interest in keeping their finger on the pulse with regard to the latest
computer technologies. A rule base could be set up to handle an existing inventory of PCs, for example, where the problem is most apparent!
The IT/FM departments could use such a system that takes data from the asset register such as PC information held by individuals, and
compare it to department requirements and company standards. Using the acquired knowledge from PC experts, upgrade requirements
could be flagged before complaints ensue. Rules can be based on factors such as software memory requirements, networking requirements etc.
One rule could be 'IF Windows 95 THEN 32Mb RAM' or 'IF AutoCAD AND Windows NT THEN 64Mb AND minimum screen size of 19"'.
The sort of information held would include:
- Hard Disk size and type.
- Memory size and type.
- CPU type.
- Graphics card type.
- Bus types (PCI/ISA/AGP etc.).
- Software registered to be used and current revisions/Service Packs etc.
- Network card type.
- Screen size and type.
- Peripheral details such as access to printer types, scanners etc.
- Current software used and versions.
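As a sketch, the two example rules above could be checked against asset-register records like this. The record fields and rule wording here are hypothetical; in practice the rules would sit in the Knowledge Base itself, so the PC experts could maintain them without touching code.

```python
# Each rule pairs a condition (does this rule apply to the PC?) with a
# requirement check (does the PC satisfy it?). The asset-register record
# is a simple dict with illustrative field names.
RULES = [
    ("Windows 95 needs 32Mb RAM",
     lambda pc: "Windows 95" in pc["software"],
     lambda pc: pc["ram_mb"] >= 32),
    ("AutoCAD on Windows NT needs 64Mb RAM and a 19in screen",
     lambda pc: "AutoCAD" in pc["software"] and "Windows NT" in pc["software"],
     lambda pc: pc["ram_mb"] >= 64 and pc["screen_in"] >= 19),
]

def upgrade_flags(pc):
    """Return the rules a PC fails, i.e. upgrades to flag before complaints ensue."""
    return [name for name, applies, satisfied in RULES
            if applies(pc) and not satisfied(pc)]
```

Running this over the whole asset register would produce the upgrade shortlist for the IT/FM departments.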
Due to the nature of the computer industry, the Knowledge Base could very quickly grow out of date unless regular updates were carried out.
This would apply to both the 'internal' information on the users' systems and the 'external' information pertaining to software and hardware
revisions. The use of Internet 'Information Push' technology is ideal for feeding latest software revisions, bug-fixes and service packs to a
database that can be searched and specific relevant information added to the Knowledge Base.
It is in cases like these that the importance (and difficulties!) of Knowledge Representation become more apparent. A good Knowledge
Representation System should have (Rich (3), 1991):
- Representational Adequacy - all kinds of knowledge can be catered for.
- Inferential Adequacy - the ability to manipulate the representational structures so as to derive new structures corresponding to new
knowledge inferred from the old.
- Inferential Efficiency - the ability to incorporate additional information that focuses the inference mechanisms in the most promising directions.
- Acquisitional Efficiency - the ability to acquire new information easily.
Some advantages of such an Expert System for computer hardware/software:
- Minimise call-outs by the help-desk, as many problems can be pre-empted.
- Aid departments to accurately budget for their IT requirements in the near future.
- A further aid to preventing software piracy.
- Minimise the 'status symbol effect', as expensive, high-spec machines are redistributed to those that need them.
- Revision standards are easier to maintain throughout the company (e.g. instead of a mixture of Word 6.0, Word 95 and Word 97 being used,
a standard of one version can be adhered to).
Neural Network Applications
A Neural Network consists of nodes and links between the nodes that carry numerical values or weights. Each node adds up the inputs on the
links and outputs a value between -1 and 1. Adjusting the weights throughout the Neural Network determines the final outputs and is
the way in which the Neural Network is trained. Training generally consists of feeding information into the input nodes, comparing the outputs
from the output nodes with the expected outputs, and then adjusting the weights throughout the Neural Net until an average error of less than
0.3 is achieved when compared to the expected results.
Neural Networks come in single-layer form (where the input nodes link directly to the output nodes) or multi-layer form (where there are
'hidden' layers in addition to the normal input and output layers). The most commonly used is the Back-propagation Neural Network (a
generalisation of the Widrow-Hoff rule for multi-layer networks) (Davalo (1), 1991). See Appendix A for a brief description of the
Back-propagation Neural Network.
The Neural Network is ideal for pattern recognition.
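The node behaviour and training loop described above can be sketched for a single node. The output range (-1 to 1, via tanh) and the 0.3 average-error stopping point follow the description; the sample data and learning rate are purely illustrative.

```python
import math

def node_output(inputs, weights):
    """A node sums its weighted inputs and squashes the result to (-1, 1)."""
    return math.tanh(sum(i * w for i, w in zip(inputs, weights)))

def train(samples, weights, rate=0.1, target_error=0.3):
    """Adjust the weights until the average error falls below target_error."""
    while True:
        total_error = 0.0
        for inputs, target in samples:
            out = node_output(inputs, weights)
            err = target - out
            total_error += abs(err)
            # Nudge each weight in the direction that reduces the error.
            weights = [w + rate * err * i for w, i in zip(weights, inputs)]
        if total_error / len(samples) < target_error:
            return weights

# Two illustrative input patterns with opposite expected outputs.
samples = [([1.0, 0.0], 1.0), ([0.0, 1.0], -1.0)]
weights = train(samples, [0.0, 0.0])
```

A multi-layer back-propagation network follows the same principle but also propagates error estimates back through the hidden layers (see Appendix A).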
Intruder Detection
In order to bring some control into the security industry, so that Police are not called out unnecessarily, an authorisation body called
NACOSS exists. This covers installation practices, and the installers themselves carry NACOSS approval. NACOSS-installed systems
are then allowed to have direct lines to the Police. Although this brings a measure of 'filtering' with respect to installation quality, it does
nothing to minimise false alarms caused by animals, employees working late etc.
A Neural Network could be developed that learned patterns of occupation and could make decisions as to what constituted an illegal entry
and what did not. Such a system would then inform the security guard or the security system, whether or not it was worth involving the police.
The traditional method of physical investigation would still remain as an option.
There are already in existence systems that gather information to a central point, CCTV cameras, Passive Infra-Red detectors, motion sensors,
security lighting (visible and infra-red), alarm triggers in various guises etc. These inputs are commonly brought to a central console with a
multitude of screens and alarm panels. These traditional systems could be augmented with information gathered from other non-specific
security related systems such as lighting, PC microphones (picking up breakages perhaps) and tagged equipment, whose movement can be
tracked within a building; existing network monitoring tools can tell if equipment is switched off, tampered with or being used in any
way. There are 'intelligent' lighting systems in existence that can operate in 'intruder mode' such that, if 'unusual' movement is detected,
a path can be lit leading the security guard to the position of the intrusion.
The addition of a properly trained neural net could result in better decisions being made from all this information. For example, an employee
may come back to work late one evening and enter a particular building via a back entrance to pick up their PC in order to work at home. A
traditional alarm system may have been set for 'no entry by anybody' at that late hour and immediate alarms would be raised unnecessarily.
A neural net may have information that allows it to decide not to sound the alarms, since it has been trained to realise that a particular
person's access card has been detected and the PC being moved is assigned to that employee and is therefore legitimate. In this instance no form-filling
exercise has been necessary to remove the equipment, no police were called out, the security guard can rest assured that the system considers
the activity to be legitimate and the employee has complete freedom to get on with the job in hand.
Visitor at the Gate
It is not unusual for commercial buildings to have some access control on their main gates for the ingress and egress of employees, customers
and suppliers. A system could be developed that took details from the initial contact so that passes could begin to be prepared along with
refreshments, the host could be notified, and perhaps some music or a welcoming video tailored to the visitor could be presented. All of this
could happen automatically without human involvement, although a human receptionist could still be present to give a personal welcome and hand
over the pass and coffee.
The system could have the following information as inputs:
- Vehicle registration.
- Video capture of the face.
- Personal details obtained from computerised interface with the visitor at the gate.
- Name of visitor.
- Host contact.
- Reason for visit.
- Company being represented.
- Database of existing and potential clients.
- Database of suppliers.
- Database of previous visitors.
A Neural Network could be trained to make decisions based on these inputs.
One decision may be that the visitor is delivering materials to Goods Inwards, so a pass may not need to be made and a coffee may not be required.
In this instance, the people manning the Goods Inwards department need to be informed of the imminent delivery and the visitor can be directed
to Goods Inwards by the computerised interface.
On the other hand, the visitor may be a customer, the represented company could be an existing or potential client and the host contact may be
working on projects related to the visitor's business. In this case, related video information may be presented in the reception area as a
background, regular visitors may have a coffee or tea already brewed as they like it. An intelligent computerised voice system can dial the
host as soon as the visitor has been identified at the gate, thus minimising waiting time for the visitor.
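One way the gate inputs might feed a network is sketched below: the inputs are encoded as a feature vector and a single trained unit scores them. The features, visitor fields and the hand-picked weights (standing in for weights a trained network might arrive at) are all hypothetical; a real system would train on the site's own visitor history.

```python
def features(visitor):
    """Encode a gate record as numeric inputs for the network."""
    return [
        1.0 if visitor["in_supplier_db"] else 0.0,
        1.0 if visitor["in_client_db"] else 0.0,
        1.0 if visitor["reason"] == "delivery" else 0.0,
        1.0 if visitor["previous_visits"] > 0 else 0.0,
    ]

def decide(visitor, weights, bias):
    """Positive score -> reception (pass, coffee); otherwise Goods Inwards."""
    score = bias + sum(f * w for f, w in zip(features(visitor), weights))
    return "reception" if score > 0 else "goods inwards"

# Illustrative weights: supplier deliveries push the score down,
# known clients and repeat visitors push it up.
weights, bias = [-1.0, 2.0, -2.0, 1.0], 0.0
```

The same encoding extends naturally to the other inputs listed above (vehicle registration matches, face capture and so on) as further features.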
Genetic Algorithm System Applications
The Genetic Algorithm works by randomly generating solutions to a particular problem and then performing iterations, commonly using bit
patterns as representations of solutions. Each iteration selects good solutions (by way of the 'fitness' of a particular solution), performs
crossover breeding with other solutions within the population of solutions and occasionally performs mutations on individual solutions
(Munakata (1), 1998). The Genetic Algorithm works with a population of solutions, and during the iterations it continually updates that
population, throwing away the less good solutions and replacing them with improved ones (Corne (1), 1998).
The Genetic Algorithm is ideal for finding optimal solutions amongst a whole raft of candidates in a problem that has many variables. The
following two examples fit the criteria to benefit from the use of Genetic Algorithms.
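The iteration just described can be sketched compactly. The fitness function here (a count of 1s) is a stand-in for a real, problem-specific measure, and the crossover and mutation rates are illustrative.

```python
import random

def fitness(s):
    """Stand-in fitness: the number of 1s in the bit string."""
    return sum(s)

def evolve(population, generations=50, crossover_rate=0.7, mutation_rate=0.01):
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < len(population):
            # Select two parents with probability roughly proportional to
            # fitness (a small floor keeps zero-fitness strings selectable).
            a, b = random.choices(population,
                                  weights=[1 + fitness(s) for s in population],
                                  k=2)
            if random.random() < crossover_rate:
                site = random.randrange(1, len(a))          # crossing site
                a, b = a[:site] + b[site:], b[:site] + a[site:]
            # Occasional mutation: flip one bit at a random position.
            if random.random() < mutation_rate:
                i = random.randrange(len(a))
                a = a[:i] + [1 - a[i]] + a[i + 1:]
            new_pop += [a, b]
        population = new_pop[:len(population)]
    return max(population, key=fitness)
```

Swapping in a fitness function for stationery costs or menu takings turns this generic loop into either of the applications below.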
Stationery Ordering in a Commercial Building
Almost any commercial building, be it a bank, hospital, solicitors' firm or a main contractor, requires a substantial amount of stationery and other
regular supplies. It is very common for individual departments to own the responsibility of keeping their supplies topped up.
It is not unusual to see large stocks of forms, pads, pens, pencils etc. that take up cupboard space, ordered 'just in case', plus obscure items
that never get used or go out of date, whilst commonly used items frequently run out, requiring special emergency orders that do not come
with bulk order discounts. These large stocks are often multiplied department by department throughout the building with little co-ordination.
There are, however, a number of issues that could be addressed by a system employing a Genetic Algorithm. Initially, centrally
organising the stationery orders will save money on labour, enable more bulk-order buying (thereby reducing material costs) and give more
chance to spread the load of sudden changes in one department across other departments. Sudden changes could include departments
closing down or starting up, or departments changing their systems, biasing the materials used in some way. Other sudden-change scenarios
could include late deliveries, a change of material design or the introduction of a new style of stationery. With different departments
having different and constantly changing demands on stationery, there is the opportunity to minimise costs by developing a Genetic
Algorithm with a 'Just-in-time' approach to ordering.
The cost balance has to be struck between not ordering too much (thereby forfeiting interest on money spent) and not ordering too little, so
that workers cost the company money as they struggle to operate without proper equipment and the company spends more money filling
in the gaps, negating the benefits gained by ordering in bulk.
Such a Genetic Algorithm could be applied to each department in turn, since each would perhaps operate differently. The algorithm
could look at the monthly stationery orders and, perhaps from experience, some solutions as to the amounts required could be drawn
up and used as an initial population.
Appendix B shows a simple worked example of a way in which a Genetic Algorithm could be used for this Stationery Problem.
Food Catering in a Commercial Building
Through the ages, eating establishments have had to deal with the issue of wasted food. Not only does the wasted food cost money; the
time to prepare it and the energy used in cooking it cost as well.
In the competitive modern commercial environment the costs that could be saved by being more efficient on food wastage could mean the
difference between making profit or not.
There are opportunities within a commercial office building, for instance, where the building restaurant could use a genetic algorithm to
minimise the food wastage and energy used in cooking. Many buildings have quite sophisticated access control systems that give a level of
detail at least on who is in occupation (necessary for fire regulations!). Occupation of departments could be analysed and known regular
users of the restaurant could be checked for occupancy in order to aid in minimising how much food is cooked and therefore how much
energy is used. In addition, known food preferences can be analysed over time and the system could be linked to individual diaries and
the food prepared could include preferences on sandwiches, crisps, drinks and confectionery.
The inputs to the algorithm would include details such as individual names, each with a list of top ten favourite main courses, sweets,
drinks, normal time/day of eating and perhaps even details such as whether individuals go out to lunch on certain days.
The catch with finding the single optimum favourite menu and offering it with no other choice is that cash flow is unlikely to be at its
optimum, since there will always be some occupants who dislike the choice, in addition to those who used to like it gradually becoming
sick of it. The aim, therefore, is to find a number of solutions that fulfil a certain cash criterion (say, 10 menu sets).
The process is similar to the Stationery Problem described earlier, in that an initial population of random menu selections is created,
perhaps using a binary notation to indicate whether a particular menu item fulfils a cash criterion, allowing it to become part of a 'winning team'
of menus. The 'fitness' function must be carefully thought out to include a broad range of possibilities; for instance, one menu item may be more
popular if placed with one thing than with another (e.g. rice could be seen as less favourable with roast lamb but more
favourable with Lamb Korma).
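A fitness function with that pairing effect might be sketched as follows. All item names, takings figures and pairing adjustments here are illustrative; a real restaurant would derive them from till data.

```python
# Baseline expected takings for each item when offered on its own.
PREFERENCE = {"roast lamb": 30, "lamb korma": 25, "rice": 10, "potatoes": 15}

# Adjustments for combinations that sell better (or worse) together,
# e.g. rice is a better fit alongside lamb korma than roast lamb.
PAIRING = {("lamb korma", "rice"): +8, ("roast lamb", "rice"): -6}

def menu_fitness(menu):
    """Expected takings for a set of menu items, adjusted for pairings."""
    score = sum(PREFERENCE.get(item, 0) for item in menu)
    for (a, b), adjust in PAIRING.items():
        if a in menu and b in menu:
            score += adjust
    return score
```

A Genetic Algorithm would then evolve sets of menus against this score, rather than a single 'best' dish.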
Conclusion
Some other examples that come to mind are listed below:
- The FES could use a Genetic Algorithm to optimise traffic flow perhaps.
- Weather data information from the Meteorological Office Experts could be fed into building control systems throughout the regions
and have an influence on local environmental control.
- Building access could be made easier by making well-used routes available without having to swipe the card readers. Patterns
of movement could be examined for individual users and doors could be unlocked in advance of people using them. This could be
refined as each individual moves around the building. One 'strange' movement would be enough to keep a door locked until the
new movement is learned by the system.
- Optimisation of desk space in a teleworking environment would be an ideal candidate for a Genetic Algorithm.
- CCTV could learn where best to look before zooming in; perhaps a hybrid between a neural net (to decide whether certain
activity is worth investigating) and a Genetic Algorithm (to identify individuals).
Although we have looked at specific applications using specific techniques, there is no reason why aspects of a certain application
cannot be covered by varying technologies.
Training a Neural Network can take a long time, so techniques are being developed to speed this up. One such technique
is to use Genetic Algorithms to select more appropriate weights in a Back-propagation Neural Network; getting rid of the worst solutions
helps the Neural Network to 'converge' more rapidly (Munakata (2), 1998).
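The hybrid just mentioned can be sketched as below: a Genetic Algorithm searches for the weights of a small network, discarding the worst solutions each generation. The target data, network size and GA settings are all illustrative.

```python
import math
import random

def output(inputs, weights):
    """A single tanh node, as in the earlier Neural Network sketch."""
    return math.tanh(sum(i * w for i, w in zip(inputs, weights)))

def error(weights, samples):
    """Total absolute error of a weight vector over the training samples."""
    return sum(abs(t - output(i, weights)) for i, t in samples)

def evolve_weights(samples, pop_size=20, generations=30, rng=None):
    rng = rng or random.Random()
    pop = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the better half ('getting rid of the worst solutions')...
        pop.sort(key=lambda w: error(w, samples))
        survivors = pop[:pop_size // 2]
        # ...and refill the population with mutated copies of survivors.
        pop = survivors + [[w + rng.gauss(0, 0.2) for w in s]
                           for s in rng.choices(survivors,
                                                k=pop_size - len(survivors))]
    return min(pop, key=lambda w: error(w, samples))
```

The resulting weights could then seed ordinary back-propagation, which typically converges faster from a good starting point than from random weights.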
A mixture of technologies could be used to further enhance a particular system. A simple example could be the hardware/software
monitoring example, described above, where the use of a Genetic Algorithm could be introduced to work out which revision of a
particular piece of software is going to be most cost effective (taking into account factors such as upgrade costs and indirect costs
as employees attempt to match document versions).
Real-time products are available that make use of Expert System, Neural Network and Genetic Algorithm technology to bring classes-
and objects-based programming to the user. Gensym (WWW2) produce G2, which is a software environment for creating applications that
can manage complex dynamic operations. Using object-oriented programming, systems can be continuously monitored, diagnoses made
on time-critical problems and corrective action carried out as well as operating the systems at optimal conditions. Tools such as 'ReThink'
are available to work with G2 allowing modelling and 'what-if' scenarios to be looked at in close detail.
We have only described the technologies in very general terms; there is far more detail that could be gone into. Although we have yet to
see 'real intelligence' that we could define as 'life', we do have systems available that can be 'trained' (i.e. supervised). Furthermore,
there are Neural Network systems, such as those based on the Kohonen model, that are 'multi-layered' and 'unsupervised'; that is, they use
'competitive learning' techniques to deduce patterns or groupings without external guidance (Munakata (4), 1998). Often these categorisation
techniques are used in conjunction with other methods.
It is often quoted that Neural Networks have been shown to provide more reliable diagnoses than the best doctors in some instances. We
have described systems that could make life-and-death decisions. By design, no Artificial Intelligence system is going to be 100% correct all
the time. Many systems produce Certainty Factors that are used as guidance for reliability. If neural networks make decisions, then in these
days of increasing litigation, who is responsible when it goes wrong?
Perhaps there needs to be a cultural change as we humans begin to 'work with' systems such as CYC (Appendix C) that may, one day, be able
to reason with us, having gained a common-sense knowledge base that is at least equivalent to our own. Perhaps then we will see the
emergence of a new, true intelligence.
Appendix A - The Back-propagation Neural Network (Rich (4), 1991):
Back-propagation uses a sigmoid activation function (0 to 1) rather than the stepwise activation function (0 or 1) used by the perceptron.
The forward pass generates outputs which can be compared to the target outputs, and error estimates are calculated. Then weights
connected to the output units are adjusted to reduce these errors, and error estimates for the hidden layers back to the inputs are
also adjusted (the backward pass). An epoch has been completed once the whole set of patterns has been examined; it takes many epochs
for a network to be trained.
Recognising a face involves an effectively infinite number of possible inputs, so generalising is necessary. A back-propagation network
can be trained on a few different sizes of letters (e.g. A, B etc.) and then learn to recognise any size of letter.
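One epoch of the forward and backward passes described above can be sketched as follows. The network sizes, initial weights and learning rate are illustrative; bias inputs are omitted for brevity.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def epoch(patterns, w_hidden, w_out, rate=0.5):
    """One pass over the whole pattern set; returns the updated weight layers."""
    for inputs, targets in patterns:
        # Forward pass through one hidden layer to the outputs.
        hidden = [sigmoid(sum(i * w for i, w in zip(inputs, ws)))
                  for ws in w_hidden]
        outputs = [sigmoid(sum(h * w for h, w in zip(hidden, ws)))
                   for ws in w_out]
        # Error estimates at the output units (sigmoid derivative is o(1-o)).
        d_out = [(t - o) * o * (1 - o) for t, o in zip(targets, outputs)]
        # Backward pass: error estimates for the hidden units...
        d_hid = [h * (1 - h) * sum(d * ws[j] for d, ws in zip(d_out, w_out))
                 for j, h in enumerate(hidden)]
        # ...then adjust the weights connected to the output units
        # and the weights feeding the hidden layer.
        w_out = [[w + rate * d * h for w, h in zip(ws, hidden)]
                 for ws, d in zip(w_out, d_out)]
        w_hidden = [[w + rate * d * i for w, i in zip(ws, inputs)]
                    for ws, d in zip(w_hidden, d_hid)]
    return w_hidden, w_out
```

Calling epoch repeatedly over the pattern set is the 'many epochs' of training mentioned above.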
Appendix B - Genetic Algorithm Worked Example (Munakata (3), 1998):
The following is a worked example for the Stationery Optimisation Problem described earlier:
The objective is to minimise how much money is being spent on stationery without incurring costs through lack of materials being available.
We could take each department in isolation and look at the monthly stationary order. A particular department may have the following
order in a particular month:
The list of materials in a real situation would be much larger; however, a short list is easier to work with for example purposes.
For an individual department, which uses its own varying amounts of stationery, there will be an optimum level of materials to hold: not
too little, so that the department frequently runs out and has to make emergency orders etc., and not too much, so that cash flows out
of the company too rapidly. Rather than use actual amounts, we can use a binary notation such that '1' means the amount ordered
is the optimum amount and '0' means that either more than or less than the optimum amount was ordered.
If we take a random selection of three months of orders for a particular department we could have the following as our first set of solutions:
i   Ai
1   1 0 1 1 0 1
2   0 1 1 1 1 0
3   1 0 0 0 1 0
Where 'i' is the solution number and 'Ai' is the solution string, in which the first digit represents whether the target number
of pencils was ordered, the second digit the pens, and so on.
There needs to be a way of identifying the 'fitness' of each solution. In the real world this would involve looking at the relative gains and
losses of not reaching the target ordered amounts for each item. Not reaching the target for some items may have a greater impact than for
others. The fitness (fi) may be calculated by looking at previous experience. For the purposes of this example I have assigned fairly arbitrary
values based roughly on how many 1s there are in the solution. From the fitness value we can gain a 'fitness probability'
(pi = fi/F), where F is the 'total fitness' of the population (group of solutions). Finally, we can calculate the 'expected count' (n.pi),
where n is the number of solutions within the population.
i   Ai            fi     pi      n.pi
1   1 0 1 1 0 1    4    0.334    1.00
2   0 1 1 1 1 0    7    0.583    1.75
3   1 0 0 0 1 0    1    0.083    0.25
                F = 12
We could now produce 3000 solution strings such that, on average, 1000 will be copies of string i = 1, 1750 of i = 2 and 249 of i = 3. This
is in accordance with the expected counts in the last column, scaled up by sampling 1000 populations (it could be any number of course).
We achieve this by picking a 3-digit random number to match the probability in selecting a string (the Monte Carlo method):
Pick a random no. between..   String no. (i)
000 - 333                     1
334 - 916                     2
917 - 999                     3
From the above table we can see that a random number has the greatest probability of being between 334 and 916. So, to create a
new 'mating pool', we generate a random number, say 416, which selects string no. 2; then 245, giving string no. 1; and then, say, 785,
giving another string no. 2. This results in the following:
i   Ai
1   0 1 1 1 1 0
2   1 0 1 1 0 1
3   0 1 1 1 1 0
At this stage, 'Crossover' can take place depending on a pre-determined probability. This means that two solutions are picked and bits are
swapped determined by a crossing site (or dividing point) picked at random.
E.g. for solutions i = 2 and i = 3, A2 = 1 0 1 || 1 0 1 and A3 = 0 1 1 || 1 1 0 where ||
indicates the crossing site.
Swapping the last bits gives, A2' = 1 0 1 || 1 1 0 and A3' = 0 1 1 || 1 0 1
(Very occasionally, at this stage of the iteration, a random mutation can be thrown, this might mean a bit change from '0' to '1', or vice-versa,
at some random position within the solution string)
At the end of this first iteration we are left with the following population:
i   Ai
1   0 1 1 1 1 0
2   1 0 1 1 1 0
3   0 1 1 1 0 1
In this crude example it can be seen that the total and average fitness have increased. We would expect this to improve even further
after more iterations.
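The selection and crossover steps of the iteration above can be sketched in code. The fitness figures here are illustrative stand-ins for the fairly arbitrary per-solution values used in the example; a real system would derive them from the cost of missing order targets.

```python
import random

def select(population, fitnesses, rng):
    """Monte Carlo (roulette wheel) selection of one solution string."""
    pick = rng.random() * sum(fitnesses)
    running = 0.0
    for solution, f in zip(population, fitnesses):
        running += f
        if pick <= running:
            return solution
    return population[-1]

def crossover(a, b, site):
    """Swap the bits after the crossing site between two solution strings."""
    return a[:site] + b[site:], b[:site] + a[site:]

# The initial population of solution strings from the example.
population = [[1, 0, 1, 1, 0, 1], [0, 1, 1, 1, 1, 0], [1, 0, 0, 0, 1, 0]]
fitnesses = [4, 7, 1]   # arbitrary, roughly reflecting the number of 1s

rng = random.Random(0)
mating_pool = [select(population, fitnesses, rng) for _ in range(3)]
```

With the crossing site after the third bit, crossover([1,0,1,1,0,1], [0,1,1,1,1,0], 3) reproduces the swap shown above, giving [1,0,1,1,1,0] and [0,1,1,1,0,1].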
Appendix C - CYC (Rich (5), 1991):
CYC is a very large knowledge base project aimed at capturing human common-sense knowledge (10 million objects). More
comprehensive than CD (Conceptual Dependency), it contains representations of events, objects, attitudes etc.
Once the initial 'hand-coding' has happened, we should be able to feed in information automatically (encyclopaedias etc.).
CYCL is CYC's representation language, a frame-based system which also has a constraint language allowing the expression of
arbitrary first-order logical expressions. There are two levels of representation: the Epistemological Level (EL) (facts) and the
Heuristic Level (HL) (inference templates).
References
- Corne (1), Dr David & Oates, Martin, 1998, A Two-Page Genetic Algorithms Primer, Reading University, p. 1
- Davalo (1), Eric & Naim, Patrick, 1991, Neural Networks, Macmillan, ISBN 0-333-54996-1, p. 47
- Munakata (1), Toshinori, 1998, Fundamentals of the New Artificial Intelligence, Springer, ISBN 0-387-98302-3, p. 65
- Munakata (2), Toshinori, 1998, Fundamentals of the New Artificial Intelligence, Springer, ISBN 0-387-98302-3, p. 82
- Munakata (3), Toshinori, 1998, Fundamentals of the New Artificial Intelligence, Springer, ISBN 0-387-98302-3, p. 71
- Munakata (4), Toshinori, 1998, Fundamentals of the New Artificial Intelligence, Springer, ISBN 0-387-98302-3, p. 59
- Rich (1), Elaine & Knight, Kevin, 1991, Artificial Intelligence, McGraw Hill, ISBN 0-07-100894-2, p. 29
- Rich (2), Elaine & Knight, Kevin, 1991, Artificial Intelligence, McGraw Hill, ISBN 0-07-100894-2, p. 23
- Rich (3), Elaine & Knight, Kevin, 1991, Artificial Intelligence, McGraw Hill, ISBN 0-07-100894-2, p. 109
- Rich (4), Elaine & Knight, Kevin, 1991, Artificial Intelligence, McGraw Hill, ISBN 0-07-100894-2, pp. 500-509
- Rich (5), Elaine & Knight, Kevin, 1991, Artificial Intelligence, McGraw Hill, ISBN 0-07-100894-2, p. 288
- WWW1 - http://www.vauxhall.co.uk/cgi-bin/tn-new.pl
- WWW2 - http://www.gensym.com