|Last Updated: Mon Jan 27 11:18:09 UTC 2014|
The market for digital computers has traditionally been driven by performance per dollar, with the dollar split between acquisition, maintenance and software portability costs. Today's array of high performance microprocessors is a direct product of that environment, offering substantial gains in performance, compactness, price and reliability over the minicomputers of the early seventies.
Judicious application of computer technology in weapon system design can offer dramatic gains in system capability, while substantially improving mission availability through reliability, redundancy and ease of implementing self-test and fault diagnostic capabilities. Poor application of computers, on the other hand, can result in marginal gains in capability with poor mission availability, increased operator workload and a nightmare of escalating hardware and software maintenance costs. Fear of the latter has been a major cause of the reappearance of the fifties' gunsight lobby, calling for simple and cheap aircraft.
Recent combat experience, however, suggests that the importance of airborne computers will rise as the density and sophistication of hostile air defences continues to grow. No human can handle the information flow required to confuse and jam dozens of enemy radar-directed weapons while trying to acquire and destroy his target. In a protracted conflict, sustained attrition of a strike force at levels even as low as 5% can halve the force within weeks (1); dollars spent on improving survivability are therefore a good investment.
Argentina chose to ignore this during the Falklands conflict and suffered tremendous losses particularly in skilled aircrew, a resource most difficult to replace.
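The arithmetic behind the 5% figure is worth spelling out. A minimal sketch (the constant per-sortie attrition model is an illustrative assumption, not the author's reference (1)):

```python
import math

def sorties_to_halve(attrition_rate):
    """Number of sorties after which a force flying at a constant
    per-sortie attrition rate is reduced to half its original size.
    Solves (1 - r)**n = 0.5 for n."""
    return math.log(0.5) / math.log(1.0 - attrition_rate)

# At 5% attrition per sortie the force halves in roughly 13-14 sorties -
# about two weeks of daily operations.
print(round(sorties_to_halve(0.05), 1))
```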
Airborne Computers and the Microprocessor, a Brief History
It is hardly a coincidence that the development of the microprocessor has in many ways paralleled the development of airborne computer systems - both were born of the late sixties minicomputer. The minicomputer was a radical change from the earlier trend of building ever bigger and more powerful machines; it was finally recognised that small, moderately powerful machines could be put to many uses, and the proliferation of integrated circuit technology was a major incentive to move in that direction.
Like all computers, minis were made of the basic building blocks: memory to store the programs to be executed, a central processor which fetches program instructions step by step and executes them, and communication circuits which allow the machine to talk to the outside world.
Unlike larger machines, the minis used fairly simple internal architectures and were made up of fewer than a dozen printed circuit cards, the whole package weighing of the order of a hundred pounds and consuming several hundred watts of electrical power. These machines were quite clearly candidates for use in tactical aircraft. Beyond their physical attributes, they could also be easily programmed at assembler (direct machine instruction code) level rather than in a high level language (eg Fortran, Pascal, Jovial, Ada, Basic, PL/1), which allowed programmers to squeeze every ounce of performance out of them.
The first major tactical airborne weapon system to use a mission computer was the General Dynamics F-111D. This aircraft was the product of political upheavals resulting from the USAF TAC's performance in Vietnam. The poor accuracy of the F-105s had led to heavy attrition over time, and the F-111A was seen to be the answer. The F-111A and its RAAF cousin the F-111C are fully analogue weapon systems. Analogue systems use electrical circuits whose behaviour closely emulates the mathematical expression to be computed. In many instances this represents the cheapest and simplest way of electronically computing a mathematical expression, and analogue circuits have the nice property of often degrading gracefully when they fail, rather than simply dying.
These are convenient features but they cannot outweigh the inherent disadvantages of analogue computing - for each extra function in a system, extra hardware is required, which increases complexity, raising cost and reducing reliability. The latter is exacerbated by the need for highly skilled technical staff to debug the equipment. Analogue circuits can drift with age and performance can be seriously affected by temperature. The ultimate weakness of analogue technology lies, however, in its fundamental inflexibility: it is forever hardwired to perform a particular function.
Though the F-111A offered a quantum leap in capability with its automated all weather bombing and navigation system, it was perceived that further gains could be extracted by increasing the accuracy of the bombing and navigation system and cutting the crew's workload. As this required the integration of Doppler equipment and fitting of multipurpose cockpit displays, only one technological choice remained - use a digital computer as the nucleus of the weapon system. The resulting Mk II avionic suite was built around an AN/AYK-6 minicomputer which tied into an AJN-16 nav-attack system and AYN-3 and AVA-9 CRT display sets.
The MDC F/A-18A was another milestone in weapon system development, with two mission computers tied into all avionic systems via three redundant high speed busses. Though the F/A-18A represented the cutting edge in 1978 system technology and is to date unsurpassed, many of its early generation microprocessors are by now hopelessly obsolete.
Late 1968 thus saw a new era come into being - the age of the digital computer based airborne weapon system. Operationally, the F-111D was a disaster. Because of the limited performance of the early minicomputer and its comparative bulk, many system functions (cockpit display control and management) were still performed in electronic hardware rather than implemented as software running on the computer. Though this hardware was digital, and thus much more robust and reproducible than analogue hardware, the relative simplicity of the chips available at the time meant that some highly complicated circuits had to be built.
That in turn caused cooling problems, which in turn caused reliability problems. Digital systems seldom degrade gracefully, and these factors ultimately resulted in the aircraft having a mission availability often quoted as low as 35%. In spite of its problems the F-111D had a major advantage over its predecessors: its weapon system functions were programmed in computer software and were thus extremely flexible.
Shifting the complexity of the weapon system from hardware into software brings several advantages. Above all, the hardware can be simpler and far more modular, which improves reliability and eases troubleshooting. The actual operating modes of the system can be set up in dedicated software modules which reside in the machine's memory; when not used they will at worst only occupy memory space in the computer, without degrading system reliability as extra hardware would.
Incorporating new modes or types of munition only requires revision of the software, ideally only of the particular module. The price to be paid is a massive shift of design costs into the software area. Software costs usually amount to 7-10 times the cost of the hardware on which it runs, and maintaining and updating the software throughout the lifetime of the weapon system can be even more expensive, as the maintenance programmer must understand the existing software (usually without access to the expertise of the original designers) to be able to modify it sensibly.
Reliability of the software is yet another matter, as a single mistyped character in a critical place can cause a major problem (a story is told of a space probe which had to be destroyed shortly after launch due to a semicolon accidentally substituted for a comma), though bugs of this sort should be found before the system enters operation.
Significantly, the AYK-6/AJN-16 system was retained in the final F-111F, which represented the sensible design compromise the F-111D should have been - without the complex hardwired display sets.
While the USAF was battling with the F-111D another significant development was taking place: the first microprocessors were being designed. A microprocessor is a single chip (or sometimes set of several chips) which implements the processor portion of a computer. In most instances it requires external memory and support circuits to operate but these are fairly easy to build. The result is a very flexible building block.
The first commercially sold microprocessor was the rudimentary Intel 4004 chip-set, which was soon (April 1972) superseded by the 8-bit 8008 microprocessor. A revolution in electronics had begun.
The first generation of microprocessors lacked the computing power to be useful as anything but simple controllers in instruments, industrial equipment and peripherals. The pace of development accelerated however and by 1975 a multitude of various microprocessor types had hit the market (2, 3). Many of these already had the computing power to challenge the low end of the minicomputer market and as such many were sold.
A major development was the introduction of single board computers - a single printed circuit card carrying a microprocessor, memory and support chips. This species has thrived ever since as a building block in every imaginable application. The Intel 8080, Motorola 6800 and Zilog Z80 are in retrospect the most significant devices of that generation. Many of these are still being used in low end designs.
Early microprocessors had a major impact in the commercial market, particularly the newly born personal computer arena, but were relatively slow to enter the arena of airborne systems. In that area steady progress had been made in introducing computers into tactical systems. The new MDC F-15A had a highly automated single seat cockpit, with the weapon system built around an IBM 32-bit minicomputer. The radar employed a hardwired digital signal processor (essentially a specialised high speed computer for number crunching digital filter algorithms).
At this stage it was apparent that internal cabling problems would arise as the volume of equipment tied into the computers grew. The USAF, aware of the potential problem, subsequently introduced MIL-STD-1553A, a standard for internal computer communication within airborne systems - essentially it replaced point-to-point computer-to-subsystem wiring with a single communication channel shared by all systems in the aircraft.
This channel is termed an avionic multiplex bus and is implemented by a single twisted pair cable which is strung from black box to black box throughout the airframe. Through the cable, messages are instantaneously (almost!) broadcast to all of the subsystems in the aircraft. In the 1553 communication protocol the main computer is the 'master' and controls all communication traffic through the bus to, from and between slave devices, which may number up to 31.
The introduction of 1553 was a major breakthrough in hardware standardization because all equipment could be built to speak the same language, in computer terms. The weapon system designer need only assemble a set of off-the-shelf black boxes, a central computer, a bus cable and create a weapon system by writing the software to tie it all together. Though the F-16 employed 1553 for some subsystems, 1553 was to hit the headlines in 1978.
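The master/slave discipline described above can be sketched in a few lines of much simplified, hypothetical Python - real 1553 traffic consists of fixed-format command, status and data words, not strings:

```python
class RemoteTerminal:
    """A subsystem 'black box' hung on the multiplex bus."""
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def transmit(self):           # respond to a 'transmit' command
        return f"{self.name}: status OK"

    def receive(self, message):   # accept data sent with a 'receive' command
        self.inbox.append(message)


class BusController:
    """The mission computer acting as bus master: all traffic,
    even terminal-to-terminal, is commanded by the controller."""
    def __init__(self):
        self.terminals = {}

    def attach(self, terminal):
        self.terminals[terminal.name] = terminal

    def transfer(self, source, destination):
        # The controller commands the source to transmit and the
        # destination to receive; the slaves never talk unprompted.
        data = self.terminals[source].transmit()
        self.terminals[destination].receive(data)
        return data


bus = BusController()
for name in ("INS", "radar", "display"):
    bus.attach(RemoteTerminal(name))

bus.transfer("INS", "display")
print(bus.terminals["display"].inbox)
```

The point of the exercise is the topology: adding a new black box means attaching one more terminal, not running new wiring to every subsystem it must talk to.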
The GD F-111D was a milestone in weapon system development, being the first tactical aircraft with a digital computer based weapon system. The F-111F succeeded the F-111D but retained its AYK-6/AJN-16 computer/BNS - the unparalleled flexibility of a digital computer vastly eases integration with sophisticated subsystems such as the AVQ-26 Pave Tack infra-red targeting pod.
The latter phase of the seventies saw major developments in microprocessors. The 16-bit microprocessor had arrived, with Intel releasing the 8086, Motorola the 68000, Zilog the Z8000 and DEC the LSI-11. These chips had all the number crunching power of an early seventies low performance minicomputer, with internal structures to match, and were similar to program; yet, mass produced, they cost little more than their 8-bit predecessors (the LSI-11 excepted). The result was a quantum leap in the performance of personal computers and single board computers.
Nineteen Seventy Eight was very significant with the roll-out of the MDC F/A-18A Hornet fighter bomber. The 'Tron Machine' was as significant a development as the F-111D was a decade earlier. The F-18 weapon system was built around a pair of AYK-14 Mission Computers which tied into three separate 1553 busses.
Virtually every piece of avionic equipment in the aircraft was tied into these busses; those that couldn't were tied into a fudge box which could. The cockpit employed four CRT displays, two of which had enough built-in intelligence to each control one of the remainder.
The aircraft's flight controls were also digital, using four channels driven by redundant ASW-44 flight control computers. The stores management system is built around an 8-bit microprocessor which controls a decoder in each weapon station. Almost every subsystem had a bit of intelligence built in.
The result is a weapon system which is almost fully configured in software. This was seen as the only way of coping with the very diverse mission requirements of air superiority and air-ground strike. The F-18 pilot can depress a single switch to command the software to reconfigure the whole weapon system for either mission. Throttle and stick controls allow him to command all subsystems required for dogfighting; voice annunciation ('Bitching Betty') is used to alert him to problems in the system, or, for example, a bingo fuel level. Much has been written about the operating modes of the F-18, which is truly a milestone in weapon systems development (further discussion of which exceeds the scope of this article), but what is significant is that it came just a bit too early to really exploit the performance leap in microprocessors.
The F-18A as a system uses a centralised architecture, the nucleus of which is the pair of mission computers, which spend a fair amount of their time handling communications and controlling the less intelligent of the subsystems. Using more powerful microprocessors in the subsystems drastically reduces the time the main computers spend servicing them, time which can be put to other uses. Significantly, local computing power can provide a far better self test capability and in-flight integrity monitoring (also easier to implement, as it needn't be wedged into the mission computer software).
Irrespective of the degree of distributed computing power present, the fully bussed system architecture provides a quantum leap in flexibility of hardware configuration. Obsolete black boxes can be replaced with new plug-in replacements; in some instances the software changes can be minimal. This flexibility can be seductive, encouraging designers to add more features than are really necessary, which can eventually hurt in software and hardware maintenance costs.
The turn of the decade saw further developments in the microprocessor area. The software suppliers had barely caught up with the 16-bit machines when second generation 16-bit chips became available. These chips have sophisticated internal architectures comparable to mid seventies minicomputers and many offer comparable computing power. The pace of development has become so rapid that many manufacturers of microprocessor based equipment intentionally structure their designs to allow plug in replacement of processor and memory cards with higher performance replacements without the need to modify the software. This approach has also been followed in some weapon system upgrade programs and represents a very cost effective approach to improving hardware performance.
At this instant in time a generation of 32-bit supermicroprocessors is about to reach the market. These machines will devastate the low and mid range minicomputer market (or what remains of it) as they are for all practical purposes exactly that. What is appealing is that by 1990 these chips will cost a fraction of the current price (the cost of 8-bit micros dropped in a decade from hundreds of dollars to dollars apiece) which given their inherent reliability will allow the design of highly failure tolerant multiple processor machines. These will be essential to cope with the growing demand for computing power to handle tasks such as sensor fusion or artificial intelligence.
Both the US Army LHX scout helo program and the USAF Advanced Tactical Fighter (ATF) program will rely on the availability of a lot of cheap and reliable computing power.
The US DoD is heavily investing in many technology areas, the Very High Speed Integrated Circuit (VHSIC) program is in fact specifically targeted at developing exceptionally fast processor chips. There are two primary thrusts in the drive to improve current microprocessors, one is the need to reduce power consumption and the other to provide more speed. Advances in fabrication technology suggest both goals will be met.
The growth in microprocessor capability over the last decade has significant implications for airborne weapon systems design. Now one can expect all new systems to spend virtually all of their operational lifetime going through incremental multistage hardware upgrades. The ease of integration which characterises this generation of technology allows users to get every hour of life out of a tactical aircraft type.
Aside from the air superiority mission, improving aerodynamic performance yields little gain in combat capability compared to sensor or system upgrades.
System upgrades and support offer promising opportunities for Australia's high technology and defence industries. This is one area where it is quite realistic to aim for self-reliance.
Computer technology has demonstrated its capability as a potent force multiplier and will certainly increase in importance with time. At this instant there is no end in sight to the rise and rise of the microprocessor.
Artificial Intelligence, the New Frontier
One of the most significant results of the ongoing penetration of digital computers into the domain of weapon system design is the tendency to bury the natural characteristics of the equipment under a shell of software.
The pilot of a modern fighter sees a virtual machine with handling and system features all created in software. This approach has given a new meaning to the art of cockpit switchology and in theory automation on this scale allows aircrew to focus more closely on the tactical aspects of the mission. In practice however the ability of the aircraft's computers to access any imaginable piece of information available from the onboard library, vital systems or sensors can result in information saturation. How serious a problem it may be remains to be seen in combat, certainly in a low density scenario the pilot will have adequate time to make good decisions. In a high density scenario that need not apply, an argument pursued vigorously by the USMC when seeking two-seat F/A-18A aircraft.
The point has a lot of merit, as flying, for instance, a fighter escort mission will involve sorting information not only on airborne adversaries but also on the multiplicity of surface-to-air weapons. Given the limitations of what ECM can do in practice, there will always be a need for aircraft to use penetration tactics in and out of hostile territory. The situation can only worsen as Warpac and Third World countries gain access to newer classes of weapon technology while, at least in the first instance, maintaining numerical superiority.
It is fundamentally a symptom of high technology warfare: to survive, participants must possess a broad knowledge base of what each and every adversary system can do and what to do about it, very quickly. Knowing and being able to exploit one's own systems is equally important; high quality decisions are a must.
The question however remains: can the human being cope with the volume of information under that level of stress? Or equally importantly is it appropriate to sacrifice aircrew with these skills in high attrition scenarios?
The solution could very possibly lie in the use of Artificial Intelligence (AI) techniques.
Until 1981 AI was really a little known scientific discipline somewhere on the boundaries of mathematics, computer science and engineering. Its breadth of scope meant that aspects of it were researched in all of these areas and the US DoD was one of the main sources of research grants. In 1981 the Japanese startled Western technologists with the announcement of their Fifth Generation Computer Systems Project, a very ambitious national goal to develop a family of AI based computers, termed Knowledge Information Processing Systems. These intelligent machines would reason like human experts in given problem areas and were seen as the means to Japanese technological dominance over the world computer market.
The flurry of activity in Western AI research funding which followed has since exceeded Japan's commitment but certainly testifies to the respect the Japanese command in the commercial market. Whatever the outcome of the technology race the Japanese deserve full credit for identifying the strategic and commercial importance of AI. At this instant in time it appears that AI will find many commercial applications before penetrating into the more demanding military (/realtime) application area.
As a discipline AI essentially deals with the principles behind intelligent reasoning. Though many classify AI into computer science this is misleading as computers are really only the medium used to implement AI-based systems. One of the major applications of AI is naturally its use in designing software/hardware to make computers smarter but equally importantly AI can serve to improve human decision making techniques in given areas.
The breadth of AI as a discipline is considerable and many of the basic ideas used have been extracted from areas as diverse as electronic circuit design and mathematical set theory.
The construction of practical software/hardware based AI systems can require considerable knowledge in all of the above scientific disciplines.
The most shallow level at which AI may be applied is in writing software with AI features and running it together with more conventional software on a conventional computer. A step further lies the creation of sophisticated AI based software, eg expert systems, to be run on conventional machines. The final step lies in running AI based software on dedicated AI architecturally specialised computers. These may differ from conventional number crunching machines considerably if high speed is required.
Most practical applications of AI today fall into the area of expert systems. An expert system is a program that reproduces a human reasoning process in solving a particular type or family of problems. Some of the earliest expert systems were developed for medical diagnosis and oilwell analysis, essentially areas where human expertise is very expensive.
Conceptually, expert systems are usually built around rule based systems. A rule based system is essentially a set of IF (antecedent) THEN (consequent) rules, equipped with a mechanism to compare real world information against those rules and draw conclusions. A hypothetical application could be a target classifier which uses output data from a radar warning and electronic surveillance system (RHAW/ESM) and a radar with a non-cooperative target recognition (NCTR) mode. Obviously rules such as IF (aircraft has red stars) THEN (aircraft is Soviet) are of little use in practice, as one is unlikely to get close enough to see!
For the above application, however, rules such as IF (aircraft has Skip-Spin radar) THEN (aircraft is Flagon), and IF (aircraft is Flagon) THEN (aircraft has 2 AAMs, 2 gunpods, max speed 2.5M, thrust/weight circa 1:1, ...) are more realistic.
In use, an expert system will take incoming data and compare it with antecedents (or consequents, depending on the type of system) rule by rule until it finds a match or matches. The matching rule is then said to be triggered and its consequent is 'fired' (invoked). In our instance the RHAW/ESM detected a Skip-Spin radar's emissions from the direction of a radar contact; this triggered and fired the first rule, which in turn triggered and fired the second. The result is advice to the user that the radar contact is a Flagon, with possibly 2 AAMs, 2 gunpods, etc.
Practical rule-based systems may use even thousands of such rules if dealing with complicated problems which to some degree explains the need for a lot of computing power. The whole collection of rules is referred to as an inference net as it allows the inference engine to infer facts and from these infer further facts.
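The trigger-and-fire cycle just described amounts to forward chaining over the inference net, and a toy version fits in a dozen lines of Python. The rule wording follows the Flagon example above; a real system would of course use structured sensor data rather than strings:

```python
# Each rule maps an antecedent (a set of required facts) to consequent facts.
RULES = [
    ({"emitter is Skip-Spin radar"}, {"aircraft is Flagon"}),
    ({"aircraft is Flagon"},
     {"2 AAMs", "2 gunpods", "max speed 2.5M", "thrust/weight circa 1:1"}),
]

def infer(facts, rules):
    """Forward-chain: fire every rule whose antecedent is satisfied,
    adding its consequent to the fact base, until nothing new fires."""
    facts = set(facts)
    fired = True
    while fired:
        fired = False
        for antecedent, consequent in rules:
            if antecedent <= facts and not consequent <= facts:
                facts |= consequent   # trigger and fire the rule
                fired = True
    return facts

# The RHAW/ESM reports Skip-Spin emissions from the contact's bearing:
conclusions = infer({"emitter is Skip-Spin radar"}, RULES)
print("aircraft is Flagon" in conclusions)   # True
```

Chaining the second rule off the first is what lets a thousand-rule net infer facts from facts; it is also why a large net burns computing power, since every cycle rescans the whole rule set.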
Rule based systems have strengths and weaknesses. In the former areas ease of expansion by simply adding in new rules is significant particularly in military applications. The limitations of these systems are significant enough to have led to them being dubbed idiot savants, as they cannot learn, do not understand the reasoning behind their rules (though some can explain their train of reasoning) and view problems only from one perspective.
Furthermore they are ill suited to dealing with shades of grey in the certainty of the facts supplied to them. Techniques for fudging 'certainty factors' into rule based systems exist but usually lack generality.
An approach currently seen by many as the answer lies in the use of fuzzy logic which is a rigorous way of dealing with facts which may have shades of meaning or certainty. Handling natural human language and making decisions based on questionable sensor outputs are both areas which may benefit strongly. The ultimate answer may lie in expert systems which utilise each technique in areas best suited.
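One widely used 'certainty factor' fudge, borrowed here from the MYCIN medical expert system, combines two pieces of positive evidence supporting the same hypothesis as follows (the sensor figures are purely illustrative):

```python
def combine_cf(cf1, cf2):
    """MYCIN-style combination of two positive certainty factors
    supporting the same hypothesis: evidence accumulates towards,
    but never exceeds, full certainty (1.0)."""
    return cf1 + cf2 * (1.0 - cf1)

# Two independent, imperfect sensors each suggest 'contact is Flagon':
# RHAW/ESM at 0.6 confidence, NCTR at 0.5 confidence.
print(combine_cf(0.6, 0.5))   # 0.8
```

The formula lacks generality in exactly the sense noted above: it assumes the two pieces of evidence are independent, which sensor outputs on the same contact often are not.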
Expert systems can be a powerful tool in many problem areas and are already under consideration for areas such as ASW, Inverse Synthetic Aperture Radar target classification, electronic warfare threat classification and response selection (or ECCM) and onboard systems and battle damage management.
DARPA's Strategic Computing Initiative does in fact cover two programs with a heavy content in expert systems, battle management systems and pilot's associate systems. An instance of the latter is a Lockheed project aimed at supporting development of the Advanced Tactical Fighter.
The AI Pilot's Associate uses a distributed architecture with expert systems embedded in most of the critical subsystems, eg offensive, defensive, sensory, nav/com systems, these communicate with a copilot expert system which in turn advises the pilot of the aircraft.
This particular strategy is quite meaningful given the current trend toward distributed processing in avionic systems (see part I, Jan 86 AA) as additional computing power locally can then be very usefully harnessed for low level decision making tasks.
Application of AI in general promises significant gains in weapon system capability. One of the goals of the DARPA initiative lies in developing autonomous battlefield vehicles essentially eliminating the vulnerability of today's remotely piloted/controlled vehicles, jammable communication links.
The Martin Marietta Autonomous Land Vehicle (ALV) falls into that category. The ALV is being developed as an attempt at a robot battlefield scout vehicle equipped with electronic and visual sensors. The biggest problem at this stage, according to co-developer Denelcor, lies in developing the high speed parallel processing computers necessary to achieve the response times required on a live battlefield.
The ALV must be able both to navigate and to recognise and avoid threats. Interestingly, a legged vehicle is seen as easier to deal with than a wheeled one, which must find its way around obstacles rather than over them.
Autonomous vehicles, whether land based or airborne, offer advantages over manned scouts as they may be built smaller, with greater tolerance to battle damage, and may manoeuvre beyond acceptable human physiological limits - all of these factors should provide much better survivability. Similarly, the use of such machines on 'kamikaze' style or high loss rate missions will force defenders to expend much greater effort on covering critical targets. One could almost envisage retrofit conversions of ageing fighters into autonomous vehicles to get the very last ounce of life out of the platform.
The success of the Israelis using RPVs to attack heavily protected SAM sites provides a good proof of concept. Autonomous land vehicles may be extremely useful as harassment weapons to be left behind while retreating or airdropped behind enemy lines prior to an assault. Similarly providing weapons such as cruise missiles with this level of intelligence will make them extremely difficult to stop. Combined with intelligent submunitions (see TE Sept 84) this class of weapon promises to be a real force multiplier.
One of the interesting ideas that fall out of the use of AI machines in manned and autonomous platforms is the concept of corporate memory, where each and every AI machine is updated over time with the same expertise and experience. AI machines could be debriefed after sorties and, after evaluation, their experience, if useful, distributed to other units to aid in effectiveness. This way the whole force structure can benefit from locally acquired combat experience.
Lockheed ATF Proposal. The USAF ATF program envisages a long range and stealthy air superiority fighter penetrating deep into hostile airspace to destroy high value targets such as AWACS or C3 platforms. Artificial Intelligence is almost essential to handle the high density threat environment.
AI technology is in its infancy at this stage. Given its nature and the mind-boggling rate of development of computer technology it is unwise to make strong predictions as to where AI will be in a decade. It will almost certainly appear in mundane areas such as logistics, fault diagnosis and spare parts management, where it has found commercial use already.
Applied in a limited fashion to tactical aircraft AI could for instance provide for much friendlier voice communication between pilots and their machines, similarly it could be used as suggested earlier to cut down the crews' information management workload. If used in aircraft built-in-test and diagnostic systems it could almost certainly improve availability on deployments as the aircraft could carry a lot of the needed engineering and support expertise with it.
An area where AI could provide massive gains is in aiding sensor based target recognition. Identifying a target first means shooting first which often means winning the engagement. This may be the only way of effectively coping with a determined and numerically superior enemy operating weapon systems of comparable performance.
Electronic warfare and jamming equipment management is an area where AI is currently seen as offering massive gains in capability, providing a jamming system with the intelligence to reject false alarms and deliberate attempts to confuse by adversaries.
Preflight mission planning and tactics development is another potential application area, given the complexity of the air/electronic battlefield of coming years.
Given existing trends in avionic development, the integration of AI into systems may prove to be more of an evolutionary exercise than a revolutionary one. The problem at hand is more whether the human element will accept advice from a reasoning, talking machine without seeing it as a threat or adversary. A significant challenge lies in developing suitable forms of interface.
In the context of Australia's defence forces AI would appear to be a very cost effective force multiplier given the small size of our offensive forces. Whether an attempt to develop a local capability occurs is yet to be seen.
It is obvious that Artificial Intelligence is set to reshape many aspects of weapon system design; whether its greatest contribution will be at the system level or the subsystem level is not yet apparent. Either way we can look forward to some very interesting developments.
Editor's Note 2005: AI has made incremental progress over the two decades since this primer was written, but the ambitious objective of emulating human cognitive skills remains still out of reach. It is curious that the most vociferous advocates of AI are usually individuals with no research background in this area.
|Artwork, graphic design, layout and text © 2004 - 2014 Carlo Kopp; Text © 2004 - 2014 Peter Goon; All rights reserved.|