Tuesday, October 17, 2006

Computer Chaos

Computer Chaos -- A Whimsical View?

This article was written for publication some years ago and, surprise surprise, nothing has changed. It appears here again with a small addition.

How dependent are we becoming on computers and IT as a driving force in economic or social development? Is there blind faith in the expectation that the technology will be the next great leap forward, coming to the aid of ailing Western economies and delivering enhanced wealth to replace old economic generators, or even to mend social fabric?

Key operational functions important to the security and stability of the world are now inextricably caught up in the interconnection of IT. It might be argued that such functions are driven more by reliance on computers, and the inbuilt inflexibility of computer systems, than by any considered structural design in the application of the technology. Is such a laissez-faire attitude to the implementation of IT leading the economy and social fabric to breakdown point before anyone is aware of a problem, or can stop it?

The past decade has seen an unprecedented shift in the expectations of multinational companies: a rapid ideological shift in the reason for being in business. Shareholder value now takes premier place as the prerequisite for business activity, feeding the bear-or-bull frenzy of the money and share markets. Automated share trading, available to an increasing number of individuals, is becoming a volatile force in the money market, adding to the capacity for massive global movements of money. Downsizing, the drive for faster efficiency and higher profits, together with this shareholder-value ideology, are leading to the replacement of human-operated work with automatic computer systems. This reduction in human operating functions places ever-greater reliance on the technology of systems to perform functions that humans previously did.

A person sceptical of IT may see that computer systems architecture is restricting human interactive processes and the chain of responsive command. Designers of systems assume a perfect operating state, without much regard to the actual work process, or its purpose, as previously carried out by a human-operated system. This fixed architecture places restrictions on human interaction, with designers often not fully understanding what intelligent reconstructive work humans did to make fallible manual systems operate.

Computers are rigid in their output and totally reliant on the value and accuracy of the input information: ‘dross in, dross out’. Within this they do what the programmer designs them to do. Operations are systemic; one command relies on another. In a perfect operating environment everything would follow a set path to a ‘correct’ conclusion. Fortunately the human environment is subject to constant illogical, irrational, intuitive change.
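The ‘dross in, dross out’ point can be sketched in a few lines of Python (a hypothetical toy example, not any real system: the routine, prices and figures are invented for illustration):

```python
# A toy billing routine: it does exactly what it is told, no more.
def total_invoice(line_items):
    """Sum (quantity, unit_price_in_pence) pairs with no sanity checks."""
    return sum(qty * price for qty, price in line_items)

# Accurate input gives an accurate output.
print(total_invoice([(2, 999), (1, 450)]))     # 2448 pence

# A mis-keyed negative quantity is processed just as faithfully:
# the program has no notion that the figure is nonsense.
print(total_invoice([(-200, 999), (1, 450)]))  # -199350 pence
```

The machine follows its set path to a ‘correct’ conclusion either way; only a human looking at the result knows the second answer is absurd.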

Humans can make value judgements and exercise ‘common sense’ to correct nonsense information. However, where the unquestioned accuracy of computer-held data is taken as fact, complications and problems will occur. Changing anything in this environment is not possible if the computer system has not been programmed to accept the (correct) non-standard input, or if that input contradicts the set programme. Finding where the authority lies to make changes is not easy and takes time. Even where the surrounding human activity runs on common sense – an operator knows that something does not make sense or is inaccurate – they can be powerless to change the information. This assumes the operator even recognises the error, or is willing to fight the IT system to correct it. A recent dispute with a call centre that ‘communicated’ with another call centre made me appreciate this difficulty.
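How a correct but non-standard input gets refused can be illustrated with a minimal sketch, assuming a hypothetical over-strict validation rule of the kind often baked into a form (the pattern and postcodes here are illustrative, not any real system's rule):

```python
import re

# A hypothetical over-strict rule: it accepts only the commonest
# postcode shape, so a genuine but unusual one is refused.
POSTCODE = re.compile(r"^[A-Z]{1,2}[0-9]{1,2} [0-9][A-Z]{2}$")

def accept_postcode(code):
    """Return True only if the code matches the programmed pattern."""
    return bool(POSTCODE.match(code))

print(accept_postcode("SW1 2AA"))   # True  - fits the expected shape
print(accept_postcode("EC1A 1BB"))  # False - real format, wrong 'shape'
# The operator can see the address is real, but the programme
# offers no override: the set pattern has the final say.
```

The operator's common sense counts for nothing here; only someone with authority over the programme itself can widen the pattern.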

Most operators of computers are becoming handcuffed by the technology and accept that ‘systems go down’ – a fabulous excuse, sometimes, to do nothing; any action is then held while everyone waits until the systems are up and running again.
Knowing what the problem was, how a minor malfunction occurred, or how to make sure it cannot happen again is not the operator’s job. That is the domain of the systems and programme managers, and they do not generally believe in the operator’s need to know.

Although computer hardware is generally more reliable, a number of problems are likely to be of growing concern and, if combined, could be of catastrophic proportions.

The hardware is now of such density that ‘stray’ electron or radio surges can possibly disable or falsely charge a system – have it do what it is not supposed to. This is already of some concern: you are asked not to use your mobile in a hospital, aeroplane, garage and so on, even though the risk is minimal!

Systems could increasingly go wrong as a result of a step increase in hidden historical software and architectural failures. This is of particular relevance at the interface between human requirements and the requirements of the organisation, its systems processes and its computerised automatic responses.

The increasing interconnectivity of IT systems on a global basis. The speed and scope of Internet expansion, once access times improve, could offer a much faster infection rate for bugs and hacking. Corruption, and a financial meltdown caused by deliberate or accidental calamities, are real possibilities.

Human fallout: as technology moves up, the cost of staying in the e-world increases. Those with the resources can keep pace; economically inactive people cannot.

The initial instigation of computer installations is driven by the need to process information faster with reduced human input. Problems start with a lack of understanding of how and why the existing human operation works. Those involved in the active process may not be consulted, as the decision to move to IT systems is often a sole, corporate managerial one. Assuming there is a systems strategy, the next flaw lies with the people designing the IT systems. They may start out with a clear brief but, during design and implementation, get diverted by changes to it. Key staff leave, and the person assuming responsibility does not grasp the full brief, interprets it in their own way, and lays the final faulty foundation.

The final soft/hard system may then not do what it was ultimately required to do, is forced to be adapted, or is cancelled – like the Horizon project for the Post Office (a social security payment system), Trawlerman (a Secret Service ID-checking system dumped after £30m), the Wessex Health management system which lost £40m, and the passport debacle, to name a few.

On a much larger scale, the recent millennium fiasco has yet to cause the chaos forecast, yet it is costing the Western economy billions to provide safeguards. It is notable that a number of ‘developed, developing and Asian’ countries have done very little, viewing it as an inconsequential risk. Link this to the ever-burgeoning use of the web, which is (from a democratic point of view, thankfully) unregulated: it does much for the computer buffs but drives an increasing gap between the actual everyday users and the black-art programmer-fixers. Few people had the foresight to recognise, or the expectation of, the growth of computer technology. Even over the past half-decade, when it could clearly be seen, no compatible guiding statutes for safety nets have been adopted by any country.

The potential conflict between IT implementation and human requirements, together with this lack of ‘guiding’ foresight and the continued blind belief in the application of computers, might lead at some stage to a virtual computer gridlock of catastrophic proportions. The future danger with, say, ‘the bug’ therefore has little to do with any inability to handle a date change, and more to do with being bitten by the computers themselves – compounded programmed bugs and extraneous flash-over actions! Of course, by that time computers will be so intelligent, and run everything, that they will have to solve the problem of what to do with all the spare humans who can’t consume the stuff they make.


IEDO.
2000-01-01

1.9.06 an update:

Yet again the government has been caught out, not understanding the complexity of installing a computer system and not fully appreciating what goes on to deliver the current intelligent, labour-intensive paper system. The wrong people are invariably involved in the ring-fenced project discussions; they often have no grasp of what systems are active in their own work area and do not involve the actual ‘coal face’ workers.

Once a design brief is established by these supercilious, ignorant managers, it is not unusual for political or other interested parties to require a change to the system. That change significantly alters the proposed operating parameters of the brief and of the proposed computer system. As the design gets more complicated, the delivery time elongates and cost overruns occur. At this stage, because a great deal of effort and resources have gone into the project, a great resistance and reluctance builds up against pointing out the errors and major flaws that have been noted. No one wants to cry wolf, as their job or contract to deliver the project would be called into question, so the now mis-designed project goes on to its final ignominious end.

Failure is near inevitable where an inimical conception or financial limitation is imposed on the project, where experienced expertise and knowledge are lacking, or where a later-imposed operating budget does not match the desired outcome. These constraints, more than anything else, lead the project to underachieve its objective and usually to overextend before the errors are accepted as too large to justify continuing to an uncertain conclusion.

Another good example of grandiose failure of late is the SS benefit payment system – £6.8bn spent, now quietly dropped.

This is on top of the national programme for the NHS, at £12.4bn, which is suffering problems: the internal systems architectures of the regional health authorities are not compatible and do not share similar information platforms, leading to very slow response times, insufficient access space and handling capacity, and difficult log-ons for doctors’ surgeries.

Now enter the idea of identity cards – an all-encompassing data system to be developed for unspecified purposes. (Control of the populace?) Given the history of government expenditure and experience in this field, one can already foresee the problems and misuse in this, the most ambitious computerisation project to date.

© Renot 2005-2006
