
APL (named after the book A Programming Language) is a programming language developed in the 1960s by Kenneth E. Iverson. Its central datatype is the multidimensional array. It uses a large range of special graphic symbols to represent most functions and operators, leading to very concise code. It has been an important influence on the development of concept modeling, spreadsheets, functional programming, and computer math packages. It has also inspired several other programming languages. As of 2018, it is still used for some applications.



History

The mathematical notation for manipulating arrays which developed into the APL programming language was developed by Iverson at Harvard University starting in 1957, and published in his A Programming Language in 1962. The preface states its premise:

Applied mathematics is largely concerned with the design and analysis of explicit procedures for calculating the exact or approximate values of various functions. Such explicit procedures are called algorithms or programs. Because an effective notation for the description of programs exhibits considerable syntactic structure, it is called a programming language.

In 1960, he began work for IBM and, working with Adin Falkoff, created APL based on the notation he had developed. This notation was used inside IBM for short research reports on computer systems, such as the Burroughs B5000 and its stack mechanism when stack machines versus register machines were being evaluated by IBM for upcoming computers.

Also in 1960, Iverson used his notation in a draft of the chapter A Programming Language, written for a book he was writing with Fred Brooks, Automatic Data Processing, which would be published in 1963.

As early as 1962, the first attempt to use the notation to describe a complete computer system happened after Falkoff discussed with Dr. William C. Carter his work to standardize the instruction set for the machines that later became the IBM System/360 family.

In 1963, Herbert Hellerman, working at the IBM Systems Research Institute, implemented a part of the notation on an IBM 1620 computer, and it was used by students in a special high school course on calculating transcendental functions by series summation. Students tested their code in Hellerman's lab. This implementation of a part of the notation was called Personalized Array Translator (PAT).

In 1963, Falkoff, Iverson, and Edward H. Sussenguth Jr., all working at IBM, used the notation for a formal description of the IBM System/360 series machine architecture and functionality, which resulted in a paper published in IBM Systems Journal in 1964. After this was published, the team turned their attention to an implementation of the notation on a computer system. One of the motivations for this focus of implementation was the interest of John L. Lawrence who had new duties with Science Research Associates, an educational company bought by IBM in 1964. Lawrence asked Iverson and his group to help use the language as a tool to develop and use computers in education.

After Lawrence M. Breed and Philip S. Abrams of Stanford University joined the team at IBM Research, they continued their prior work on an implementation programmed in FORTRAN IV for a part of the notation which had been done for the IBM 7090 computer running on the IBSYS operating system. This work was finished in late 1965 and later named IVSYS (for Iverson system). The basis of this implementation was described in detail by Abrams in a Stanford University Technical Report, "An Interpreter for Iverson Notation", in 1966; this work was formally supervised by Niklaus Wirth. Like Hellerman's PAT system earlier, this implementation did not include the APL character set but used special English reserved words for functions and operators. The system was later adapted for a time-sharing system and, by November 1966, it had been reprogrammed for the IBM System/360 Model 50 computer running in a time-sharing mode and was used internally at IBM.

A key development in the ability to use APL effectively, before the wide use of cathode ray tube (CRT) terminals, was the development of a special IBM Selectric typewriter interchangeable typing element with all the special APL characters on it. This was used on paper printing terminal workstations using the Selectric typewriter and typing element mechanism, such as the IBM 1050 and IBM 2741 terminal. Keycaps could be placed over the normal keys to show which APL characters would be entered and typed when that key was struck. For the first time, a programmer could type in and see proper APL characters as used in Iverson's notation and not be forced to use awkward English keyword representations of them. Falkoff and Iverson had the special APL Selectric typing element, 987 and 988, designed in late 1964, although no APL computer system was available to use them. Iverson cited Falkoff as the inspiration for the idea of using an IBM Selectric typing element for the APL character set.

Some APL symbols, even with the APL characters on the Selectric typing element, still had to be typed in by over-striking two extant element characters. An example is the grade up character, which had to be made from a delta (shift-H) and a Sheffer stroke (shift-M). This was necessary because the APL character set was larger than the 88 characters allowed on the typing element.

The first APL interactive login and creation of an APL workspace was in 1966 by Larry Breed using an IBM 1050 terminal at the IBM Mohansic Labs near Thomas J. Watson Research Center, the home of APL, in Yorktown Heights, New York.

IBM was chiefly responsible for introducing APL to the marketplace. APL was first available in 1967 for the IBM 1130 as APL\1130. It would run in as little as 8k 16-bit words of memory, and used a dedicated 1 megabyte hard disk.

APL gained its foothold on mainframe timesharing systems from the late 1960s through the early 1980s, in part because it would run on lower-specification systems that had no dynamic address translation hardware. Additional improvements in performance for selected IBM System/370 mainframe systems included the APL Assist Microcode, in which some support for APL execution was included in the processor's firmware rather than APL being exclusively a software product. Somewhat later, as suitably performing hardware finally became available in the mid- to late 1980s, many users migrated their applications to the personal computer environment.

Early IBM APL interpreters for IBM 360 and IBM 370 hardware implemented their own multi-user management instead of relying on the host services, thus they were their own timesharing systems. First introduced in 1966, the APL\360 system was a multi-user interpreter. The ability to programmatically communicate with the operating system for information and setting interpreter system variables was done through special privileged "I-beam" functions, using both monadic and dyadic operations.

In 1973, IBM released APL.SV, which was a continuation of the same product, but which offered shared variables as a means to access facilities outside of the APL system, such as operating system files. In the mid-1970s, the IBM mainframe interpreter was even adapted for use on the IBM 5100 desktop computer, which had a small CRT and an APL keyboard, when most other small computers of the time only offered BASIC. In the 1980s, the VSAPL program product enjoyed wide use with Conversational Monitor System (CMS), Time Sharing Option (TSO), VSPC, MUSIC/SP, and CICS users.

In 1973-1974, Dr. Patrick E. Hagerty directed the implementation of the University of Maryland APL interpreter for the 1100 line of the Sperry UNIVAC 1100/2200 series mainframe computers. At the time, Sperry had no APL offering of its own. In 1974, student Alan Stebbens was assigned the task of implementing an internal function.

In the 1960s and 1970s, several timesharing firms arose that sold APL services using modified versions of the IBM APL\360 interpreter. In North America, the better-known ones were I. P. Sharp Associates, Scientific Time Sharing Corporation (STSC), Time Sharing Resources (TSR), and The Computer Company (TCC). CompuServe also entered the market in 1978 with an APL interpreter based on a modified version of Digital Equipment Corp and Carnegie Mellon's implementation, which ran on DEC's KI and KL 36-bit machines. CompuServe's APL was available both to its commercial market and the consumer information service. With the advent first of less expensive mainframes such as the IBM 4300, and later the personal computer, by the mid-1980s the timesharing industry was all but gone.

Sharp APL was available from I. P. Sharp Associates, first as a timesharing service in the 1960s, and later as a program product starting around 1979. Sharp APL was an advanced APL implementation with many language extensions, such as packages (the ability to put one or more objects into a single variable), file system, nested arrays, and shared variables.

APL interpreters were available from other mainframe and mini-computer manufacturers also, notably Burroughs, Control Data Corporation (CDC), Data General, Digital Equipment Corporation (DEC), Harris, Hewlett-Packard (HP), Siemens AG, Xerox, and others.

Garth Foster of Syracuse University sponsored regular meetings of the APL implementers' community at Syracuse's Minnowbrook Conference Center in Blue Mountain Lake, New York. In later years, Eugene McDonnell organized similar meetings at the Asilomar Conference Grounds near Monterey, California, and at Pajaro Dunes near Watsonville, California. The SIGAPL special interest group of the Association for Computing Machinery continues to support the APL community.

In 1979, Iverson received the Turing Award for his work on APL.

Filmography, Videos: Over the years APL has been the subject of more than a few films and videos. Some of these include:

  • "Chasing Men Who Stare at Arrays" Catherine Lathwell's Film Diaries; 2014, film synopsis - "people who accept significantly different ways of thinking, challenge the status quo and as a result, created an invention that subtly changes the world. And no one knows about it. And a Canadian started it all... I want everyone to know about it."
  • "The Origins of APL - 1974 - YouTube", YouTube video, 2012, uploaded by Catherine Lathwell; a talk show style interview with the original developers of APL.
  • "50 Years of APL", YouTube, 2009, by Graeme Robertson, uploaded by MindofZiggi, history of APL, quick introduction to APL, a powerful programming language currently finding new life due to its ability to create and implement systems, web-based or otherwise.
  • "APL demonstration 1975", YouTube, 2013, uploaded by Imperial College London; 1975 live demonstration of the computer language APL (A Programming Language) by Professor Bob Spence, Imperial College London.

APL2

Starting in the early 1980s, IBM APL development, under the leadership of Dr Jim Brown, implemented a new version of the APL language that contained as its primary enhancement the concept of nested arrays, where an array can contain other arrays, and new language features which facilitated integrating nested arrays into program workflow. Ken Iverson, no longer in control of the development of the APL language, left IBM and joined I. P. Sharp Associates, where one of his major contributions was directing the evolution of Sharp APL to be more in accord with his vision.

As other vendors were busy developing APL interpreters for new hardware, notably Unix-based microcomputers, APL2 was almost always the standard chosen for new APL interpreter developments. Even today, most APL vendors or their users cite APL2 compatibility as a selling point for those products.

APL2 for IBM mainframe computers is still available. IBM cites its use for problem solving, system design, prototyping, engineering and scientific computations, expert systems, teaching mathematics and other subjects, visualization, and database access. It was first available for CMS and TSO in 1984. The APL2 Workstation edition (Windows, OS/2, AIX, Linux, and Solaris) followed much later, in the early 1990s.

Microcomputers

The first microcomputer implementation of APL was on the Intel 8008-based MCM/70, the first general-purpose personal computer, in 1973. The size of arrays along any dimension could be no larger than 255, and the machine was quite slow, but it was very convenient for use in education. Interestingly, a significant part of sales was to small businesses, which found it more cost-effective and accessible than the time-sharing services then available, and for whom the array size limits were no barrier.

IBM's own IBM 5100 microcomputer (1975) offered APL as one of two built-in ROM-based interpreted languages for the computer, complete with a keyboard and display that supported all the special symbols used in the language. While the 5100 was very slow and operated its screen only like a typewriter, its successor the 5110 had more acceptable performance and a read-write addressable text screen. Graphics could be printed on an external matrix printer.

In 1976 DNA Systems introduced an APL interpreter for their TSO Operating System, which ran timesharing on the IBM 1130, Digital Scientific Meta-4, General Automation GA 18/30 and Computer Hardware CHI 21/30.

The VideoBrain Family Computer, released in 1977, had only one programming language available for it: a dialect of APL called APL/S.

A small APL for the Intel 8080, called EMPL, was released in 1977, and Softronics APL, with most of the functions of full APL, was released in 1979 for 8080-based CP/M systems.

In 1977, the Canadian firm Telecompute Integrated Systems, Inc. released a business-oriented APL interpreter named TIS APL for Z80-based systems. It featured the full set of file functions for APL, plus full-screen input and switching of the right and left arguments for most dyadic operators by introducing the ~. prefix to all single-character dyadic functions such as - or /.

Vanguard APL was available for Zilog Z80 CP/M-based processors in the late 1970s. The Computer Company (TCC) released APL.68000 in the early 1980s for Motorola 68000-based processors, this system being the basis for MicroAPL Limited's APLX product. I. P. Sharp Associates released a version of their APL interpreter for the IBM PC and PC-XT/370. For the IBM PC, an emulator was written that facilitated reusing much of the IBM 370 mainframe code. Arguably, the best known APL interpreter for the IBM Personal Computer was Scientific Time Sharing Corporation's (STSC) APL*Plus/PC.

The Commodore SuperPET, introduced in 1981, included an APL interpreter developed by the University of Waterloo.

In the early 1980s, the Analogic Corporation developed The APL Machine, an array-processing computer designed to be programmed only in APL. It had three processing units: the user's workstation, an IBM PC, where programs were entered and edited; a Motorola 68000 processor that ran the APL interpreter; and the Analogic array processor that executed the primitives. At the time of its introduction, The APL Machine was likely the fastest APL system available. Although a technological success, The APL Machine was a marketing failure. The initial version supported a single process at a time; at the time the project was discontinued, the design had been completed to allow multiple users. As an aside, an unusual aspect of The APL Machine was that the library of workspaces was organized such that a single function or variable shared by many workspaces existed only once in the library. Several of the members of The APL Machine project had formerly spent several years with Burroughs implementing APL\700.

At one time, Bill Gates claimed in his Open Letter to Hobbyists that Microsoft Corporation planned to release a version of APL, but that never occurred.

An early 1978 publication by Rodnay Zaks from Sybex was A Microprogrammed APL Implementation (ISBN 0-89588-005-9), the complete source listing for the microcode of a Digital Scientific Corporation Meta 4 microprogrammable processor implementing APL. This topic was also the subject of his PhD thesis.

In 1979, William Yerazunis wrote a partial version of APL in Prime Computer FORTRAN, extended it with graphics primitives, and released it. This was also the subject of his Masters thesis.

Extensions

Various implementations of APL by APLX, Dyalog, et al., include extensions for object-oriented programming, support for .NET Framework, XML-array conversion primitives, graphing, operating system interfaces, and lambda calculus expressions.




Design

Unlike traditionally structured programming languages, APL code is typically structured as chains of monadic or dyadic functions, and operators acting on arrays. APL has many nonstandard primitives (functions and operators) that are indicated by a single symbol or a combination of a few symbols. All primitives are defined to have the same precedence, and always associate to the right. Thus, APL is read or best understood from right-to-left.

Early APL implementations (circa 1970 or so) had no programming loop-flow control structures, such as do or while loops, and if-then-else constructs. Instead, they used array operations, and use of structured programming constructs was often not necessary, since an operation could be performed on a full array in one statement. For example, the iota function (⍳) can replace for-loop iteration: ⍳N when applied to a scalar positive integer yields a one-dimensional array (vector), 1 2 3 ... N. More recent implementations of APL generally include comprehensive control structures, so that data structure and program control flow can be clearly and cleanly separated.
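
For instance, summing the first six integers needs no loop at all. The following is a minimal illustrative session (not drawn from the original article), shown with the conventional six-space input indent and the ⍝ comment symbol:

      ⍳6                ⍝ iota: the vector 1 2 3 4 5 6
1 2 3 4 5 6
      +/⍳6              ⍝ plus-reduction over that vector replaces an explicit summing loop
21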

The APL environment is called a workspace. In a workspace the user can define programs and data, i.e., the data values exist also outside the programs, and the user can also manipulate the data without having to define a program. In the examples below, the APL interpreter first types six spaces before awaiting the user's input. Its own output starts in column one.

The user can save the workspace with all values, programs, and execution status.

APL is well known for its use of a set of non-ASCII symbols, which are an extension of traditional arithmetic and algebraic notation. Having single character names for single instruction, multiple data (SIMD) vector functions is one way that APL enables compact formulation of algorithms for data transformation such as computing Conway's Game of Life in one line of code. In nearly all versions of APL, it is theoretically possible to express any computable function in one expression, that is, in one line of code.

Because of the unusual character set, many programmers use special keyboards with APL keytops to write APL code. Although there are various ways to write APL code using only ASCII characters, in practice it is almost never done. (This may be thought to support Iverson's thesis about notation as a tool of thought.) Most if not all modern implementations use standard keyboard layouts, with special mappings or input method editors to access non-ASCII characters. Historically, the APL font has been distinctive, with uppercase italic alphabetic characters and upright numerals and symbols. Most vendors continue to display the APL character set in a custom font.

Advocates of APL claim that the examples of so-called write-only code (badly written and almost incomprehensible code) are almost invariably examples of poor programming practice or novice mistakes, which can occur in any language. Advocates also claim that they are far more productive with APL than with more conventional computer languages, and that working software can be implemented in far less time and with far fewer programmers than using other technology.

They also may claim that because it is compact and terse, APL lends itself well to larger-scale software development and complexity, because the number of lines of code can be reduced greatly. Many APL advocates and practitioners also view standard programming languages such as COBOL and Java as being comparatively tedious. APL is often found where time-to-market is important, such as with trading systems.

Iverson later designed the programming language J, which uses ASCII with digraphs instead of special symbols.




Execution

Because APL's core objects are arrays, it lends itself well to parallelism, parallel computing, massively parallel applications, and very-large-scale integration or VLSI.

Interpreters

APLNext (formerly APL2000) offers an advanced APL interpreter that operates on Linux, Unix, and Windows. It supports Windows automation, calls to the operating system and to user-defined dynamic-link libraries (DLLs), has an advanced APL File System, and represents the current level of APL language development. APL2000's product is an advanced continuation of Scientific Time Sharing Corporation's (STSC) successful APL*Plus/PC and APL*Plus/386 product line.

Dyalog APL is an advanced APL interpreter that operates on AIX, Linux (including on the Raspberry Pi), macOS and Microsoft Windows. Dyalog has extensions to the APL language, which include new object-oriented programming features, many language enhancements, plus a consistent namespace model used for both its Microsoft Automation interface, and native namespaces. For the Windows platform, Dyalog APL offers tight integration with .NET, plus limited integration with the Microsoft Visual Studio development platform.

IBM offers a version of IBM APL2 for IBM AIX, Linux, Sun Solaris and Windows systems. This product is a continuation of APL2 offered for IBM mainframes. IBM APL2 was arguably the most influential APL system, which provided a solid implementation standard for the next set of extensions to the language, focusing on nested arrays.

NARS2000 is an open-source APL interpreter written by Bob Smith, a well-known APL developer and implementor from STSC in the 1970s and 1980s. NARS2000 contains advanced features and new datatypes, runs natively on Windows (32- and 64-bit versions), and runs on Linux and Apple macOS with Wine.

MicroAPL Limited offers APLX, a full-featured 64-bit interpreter for Linux, Microsoft Windows, and macOS systems. The core language is closely modelled on IBM's APL2 with various enhancements. APLX includes close integration with .NET Framework, Java, Ruby, and R. Effective July 11, 2016, MicroAPL withdrew APLX from commercial sale. Dyalog began hosting the APLX website including the download area and documentation.

Soliton Incorporated offers the Sharp APL for UniX (SAX) interpreter, for Unix and Linux systems. This is a further development of I. P. Sharp Associates' Sharp APL product. Unlike most other APL interpreters, Kenneth E. Iverson had some influence in the way nested arrays were implemented in Sharp APL and SAX. Nearly all other APL implementations followed the course set by IBM with APL2, thus some important details in Sharp APL differ from other implementations.

OpenAPL is an open-source software implementation of APL, published by Branko Bratkovic. It is based on code by Ken Thompson of Bell Laboratories, together with contributions by others. It is licensed under the GNU General Public License, and runs on Unix systems including Linux on x86, SPARC, and other central processing units (CPU).

GNU APL is a free implementation of ISO Standard 13751 and thus similar to APL2. It runs on GNU/Linux, and on Windows using Cygwin. It uses Unicode internally. It was written by Jürgen Sauermann.

Compilers

APL programs are normally interpreted and less often compiled. In reality, most APL compilers translated source APL to a lower level intermediate language such as C, leaving the machine-specific details to the lower level compiler. Compiling APL programs was a topic discussed often at conferences. Although some of the newer enhancements to the APL language such as nested arrays have rendered the language increasingly difficult to compile, the idea of APL compiling is still under development today.

In the past, APL compiling was regarded as a means to achieve execution speed comparable to other mainstream languages, especially on mainframe computers. Several APL compilers achieved some successes, though comparatively little of the development effort spent on APL over the years went to perfecting compiling into machine code.

As is the case when moving APL programs from one vendor's APL interpreter to another, APL programs invariably will require changes to their content. Depending on the compiler, variable declarations might be needed, some language features would need to be removed or avoided, or the APL programs would need to be cleaned up in some way. Some features of the language, such as the execute function (an expression evaluator) and the various reflection and introspection functions from APL, such as the ability to return a function's text or to materialize a new function from text, are simply not practical to implement in machine code compiling.

A commercial compiler was brought to market by STSC in the mid-1980s as an add-on to IBM's VSAPL Program Product. Unlike more modern APL compilers, this product produced machine code that would execute only in the interpreter environment; it was not possible to eliminate the interpreter component. The compiler could compile many scalar and vector operations to machine code, but it would rely on the APL interpreter's services to perform some more advanced functions, rather than attempt to compile them. However, dramatic speedups did occur, especially for heavily iterative APL code.

Around the same time, the book An APL Compiler by Timothy Budd appeared in print. This book detailed the construction of an APL translator (aplc), written in C, which performed some optimizations such as loop fusion specific to the needs of an array language. The source language was APL-like in that a few rules of the APL language were changed or relaxed to permit more efficient compiling. The translator would emit C code which could then be compiled and run outside of the APL workspace. Another compiler, also named aplc, was later created by Samuel W. Sirlin, based on Budd's work.

The Burroughs/Unisys APLB interpreter (1982) was the first to use dynamic incremental compiling to produce code for an APL-specific virtual machine. It recompiled on-the-fly as identifiers changed their functional meanings. In addition to removing parsing and some error checking from the main execution path, such compiling also streamlines the repeated entry and exit of user-defined functional operands. This avoids the stack setup and take-down for function calls made by APL's built-in operators such as Reduce and Each.

APEX, a research APL compiler from Snake Island Research Inc., is available under the GNU General Public License. APEX compiles flat APL (a subset of ISO N8485) into SAC, a functional array language with parallel semantics, and currently runs on Linux. APEX-generated code uses loop fusion and 'array contraction', special-case algorithms not generally available to interpreters (e.g., upgrade of permutation matrix/vector), to achieve a level of performance comparable to that of Fortran.

The APLNext VisualAPL system is a departure from a conventional APL system in that VisualAPL is a true .NET language which is fully interoperable with other .NET languages such as VB.NET and C#. VisualAPL is also object-oriented and Unicode-based. While it incorporates most of the features of standard APL implementations, the language extends standard APL to be .NET-compliant. VisualAPL is hosted in the standard Microsoft Visual Studio IDE and as such, invokes compiling in the same way as other .NET languages. By producing Common Intermediate Language (CIL) code, it uses the Microsoft just-in-time compiler (JIT) to support 32-bit or 64-bit hardware. Substantial performance speed-ups over standard APL have been reported, especially when (optional) strong typing of function arguments is used.

An APL-to-C# translator is available from Causeway Graphical Systems. This compiler is designed to allow APL code, translated to equivalent C#, to run fully outside of an APL environment. It requires a run-time library of array functions. Some speedup, sometimes dramatic, is visible, but is via optimisations occurring in the .NET Framework.

Matrix optimizations

APL was unique in the speed with which it could perform complicated matrix operations. For example, a very large matrix multiplication would take only a few seconds on a machine much less powerful than those of today (see the history of supercomputing), "because it operates on arrays and performs operations like matrix inversion internally, well written APL can be surprisingly fast." This advantage occurred for both technical and economic reasons:

  • Commercial interpreters delivered highly tuned linear algebra library routines.
  • Very low interpretive overhead was incurred per-array, not per-element.
  • APL response time compared favorably to the runtimes of early optimizing compilers.
  • IBM provided microcode assist for APL on several IBM370 mainframes.

Phil Abrams' much-cited paper An APL Machine illustrated how APL could make effective use of lazy evaluation, where calculations are not performed until the results are needed, and then only those calculations strictly required. An obvious (and easy to implement) form of lazy evaluation is the J-vector: when a monadic iota is encountered in the code, it is kept as a compact representation instead of being expanded in memory; in later operations, a J-vector's contents are generated from the loop's induction register rather than read from memory.

Although such methods were seldom used by commercial interpreters, they exemplify the language's best survival mechanism: not specifying the order of scalar operations or the exact contents of memory. As standardized in 1983 by ANSI working group X3J10, APL remains highly data-parallel. This gives language implementers great freedom to schedule operations as efficiently as possible. As computer innovations such as cache memory and single instruction, multiple data (SIMD) execution became commercially available, APL programs could be ported with almost no extra effort spent re-optimizing low-level details.




Terminology

APL makes a clear distinction between functions and operators. Functions take arrays (variables or constants or expressions) as arguments, and return arrays as results. Operators (similar to higher-order functions) take functions or arrays as arguments, and derive related functions. For example, the sum function is derived by applying the reduction operator to the addition function. Applying the same reduction operator to the maximum function (which returns the larger of two numbers) derives a function which returns the largest of a group (vector) of numbers. In the J language, Iverson substituted the terms verb for function and adverb or conjunction for operator.
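
As a brief illustrative sketch (not part of the original article) of how one operator derives several functions:

      +/ 3 1 4 1 5      ⍝ the reduction operator / applied to the function + derives "sum"
14
      ⌈/ 3 1 4 1 5      ⍝ the same operator applied to ⌈ (maximum) derives "largest element"
5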

APL also identifies those features built into the language, and represented by a symbol or a fixed combination of symbols, as primitives. Most primitives are either functions or operators. Coding APL is largely a process of writing non-primitive functions and (in some versions of APL) operators. However, a few primitives are considered to be neither functions nor operators, most notably assignment.

Some words used in APL literature have meanings that differ from those in both mathematics and the generality of computer science.




Syntax

APL has explicit representations of functions, operators, and syntax, thus providing a basis for the clear and explicit statement of extended facilities in the language, and tools to experiment on them.




Examples

Hello, World

This displays "Hello, world":
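
The expression is simply the string constant itself; a minimal sketch of the session described (output echoes the constant, since display is the default action):

      'Hello, world'
Hello, world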

(A sample "Hello, World" user session can be viewed on YouTube.)

A design theme in APL is to define default actions in some cases that would produce syntax errors in most other programming languages.

The 'Hello, world' string constant above displays, because display is the default action on any expression for which no action is specified explicitly (e.g. assignment, function parameter).

Exponentiation

Another example of this theme is that exponentiation in APL is written as "2*3", which indicates raising 2 to the power 3 (this would be written as "2^3" in some other languages and "2**3" in FORTRAN and Python). However, if no base is specified (as with the statement "*3" in APL, or "^3" in other languages), most other programming languages would treat this as a syntax error. APL instead assumes the missing base to be the natural logarithm constant e (2.71828...), and so interprets "*3" as "2.71828*3".
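
A short illustrative session (output shown to a typical interpreter's default print precision):

      2*3               ⍝ dyadic *: 2 raised to the power 3
8
      *3                ⍝ monadic *: the base defaults to e, so this is e to the power 3
20.08553692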

Pick 6 lottery numbers

The following immediate-mode expression generates a typical set of Pick 6 lottery numbers (six pseudo-random, non-repeating integers ranging from 1 to 40) and displays them sorted in ascending order:
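
The expression itself did not survive in this copy of the article; a reconstruction consistent with the step-by-step description below is the classic one-liner (x is merely a scratch variable, and the output shown is one possible run):

      x[⍋x←6?40]
4 9 17 21 25 39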

The above does a lot, concisely, although it may seem complex to a new APLer. It combines the following APL functions (also called primitives and glyphs):

  • The first to be executed (APL executes from rightmost to leftmost) is dyadic function ? (named deal when dyadic) that returns a vector consisting of a select number (left argument: 6 in this case) of random integers ranging from 1 to a specified maximum (right argument: 40 in this case), which, if said maximum >= vector length, is guaranteed to be non-repeating; thus, generate/create 6 random integers ranging from 1-40.
  • This vector is then assigned (←) to the variable x, because it is needed later.
  • This vector is then processed by the monadic ⍋ (grade up) function, which has as its right argument everything to the right of it up to the next unbalanced close-bracket or close-parenthesis. The result of ⍋ is the indices that would put its argument into ascending order.
  • Then the output of ⍋ is used to index the variable x, which we saved earlier, putting the items of x into ascending sequence.

Since there is no function to the left of the left-most x to tell APL what to do with the result, it simply outputs it to the display (on a single line, separated by spaces) without needing any explicit instruction to do that.

? also has a monadic equivalent called roll, which simply returns one random integer between 1 and its sole argument (to the right of it), inclusive. Thus, a role-playing game program might use the expression ?20 to roll a twenty-sided die.

Prime numbers

The following expression finds all prime numbers from 1 to R. In both time and space, the calculation complexity is O(R²) (in Big O notation).
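
The expression itself is missing from this copy; reconstructed from the walk-through that follows, it is the well-known one-liner (R must already hold the upper limit):

      (~R∊R∘.×R)/R←1↓⍳R

With R set to 6 beforehand it yields 2 3 5, exactly as traced below.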

Executed from right to left, this means:

  • Iota ⍳ creates a vector containing integers from 1 to R (if R = 6 at the start of the program, ⍳R is 1 2 3 4 5 6)
  • Drop the first element of this vector (↓ function), i.e., 1. So 1↓⍳R is 2 3 4 5 6
  • Set R to the new vector (←, assignment primitive), i.e., 2 3 4 5 6
  • The / compress (replicate) function is dyadic (binary), and the interpreter first evaluates its left argument (fully in parentheses):
  • Generate the outer product of R multiplied by R, i.e., a matrix that is the multiplication table of R by R (∘.× operator)
  • Build a vector the same length as R with 1 in each place where the corresponding number in R is in the outer product matrix (∊, set inclusion or element of, the epsilon function), i.e., 0 0 1 0 1
  • Logically negate (not) the values in the vector (change zeros to ones and ones to zeros) (~, logical not or tilde function), i.e., 1 1 0 1 0
  • Select the items in R for which the corresponding element is 1 (/ compress), i.e., 2 3 5

(Note that this assumes the APL origin is 1, i.e., indices start with 1. APL can be set to use 0 as the origin, so that ⍳6 is 0 1 2 3 4 5, which is convenient for some calculations.)

Sorting

The following expression sorts a word list stored in matrix X according to word length:
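
The expression is missing from this copy; the version usually given for this example grades each row of the character matrix X by its count of non-blank characters and indexes the rows in that order:

      X[⍋X+.≠' ';]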

Game of Life

The following function "life", written in Dyalog APL, takes a boolean matrix and calculates the new generation according to Conway's Game of Life. It demonstrates the power of APL to implement a complex algorithm in very little code, but it is also very hard to follow unless one has advanced knowledge of APL.
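
The function body is missing from this copy; the widely circulated Dyalog one-liner for this example is:

      life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}

It shifts the enclosed matrix in the eight neighbouring directions (plus the unshifted copy), sums the shifted copies to count each cell's neighbourhood, and keeps a cell alive when that sum is 3, or is 4 and the cell is currently alive.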

HTML tags removal

In the following example, also in Dyalog, the first line assigns some HTML code to a variable txt, and the next uses an APL expression to remove all the HTML tags:
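
The two lines described are missing from this copy; a reconstruction consistent with the stated result, in Dyalog dfn syntax, is shown below (the exact HTML text is an assumption chosen to match the quoted output):

      txt←'<html><body><p>This is <em>emphasized</em> text.</p></body></html>'
      {⍵/⍨~{⍵∨≠\⍵}⍵∊'<>'}txt
This is emphasized text.

The inner expression marks every character from an opening < through its closing >, and the outer compression keeps only the characters outside those spans.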

This returns the text This is emphasized text.




Character set

APL has been both criticized and praised for its choice of a unique, non-standard character set. Some who learn it become ardent adherents, suggesting that there is some weight behind Iverson's idea that the notation used does make a difference. In the 1960s and 1970s, few terminal devices and even display monitors could reproduce the APL character set. The most popular ones employed the IBM Selectric print mechanism used with a special APL type element. One of the early APL line terminals (line-mode operation only, not full screen) was the Texas Instruments TI Model 745 (circa 1977) with the full APL character set; it featured half- and full-duplex telecommunications modes for interacting with an APL time-sharing service or a remote mainframe to run a remote job entry (RJE) job.

Over time, with the universal use of high-quality graphic displays, printing devices and Unicode support, the APL character font problem has largely been eliminated. However, entering APL characters requires the use of input method editors, keyboard mappings, virtual/on-screen APL symbol sets, or easy-reference printed keyboard cards which can frustrate beginners accustomed to other programming languages. With beginners who have no prior experience with other programming languages, a study involving high school students found that typing and using APL characters did not hinder the students in any measurable way.

In defense of APL use, APL requires less coding to type in, and keyboard mappings become memorized over time. Also, special APL keyboards are manufactured and in use today, as are freely available downloadable fonts for operating systems such as Microsoft Windows. The reported productivity gains assume that one will spend enough time working in APL to make it worthwhile to memorize the symbols, their semantics, and keyboard mappings.




Use

APL has long had a select, mathematically inclined and curiosity-driven user base, who cite its powerful and symbolic nature. For example, one symbol/character can perform an entire sort; another can perform regression. It was and still is popular in financial and insurance applications, in modeling and simulations, and in mathematical applications. APL has been used in a wide variety of contexts and for many and varied purposes, including artificial intelligence and robotics. A newsletter titled "Quote-Quad" dedicated to APL was published from the 1970s until 2007 by the SIGAPL section of the Association for Computing Machinery (Quote-Quad is the name of the APL character used for text input and output).

Before the advent of full-screen systems and until as late as the mid-1980s, systems were written such that instructions were entered in various field-specific (e.g., science, business) vocabularies. APL time-sharing vendors delivered applications in this form. On the I. P. Sharp Associates timesharing system, a workspace called 39 MAGIC offered access to financial and airline data plus sophisticated (for the time) graphing and reporting. Another example is the GRAPHPAK workspace supplied with IBM's APL, then APL2.

Because of its matrix operations, APL was for some time quite popular for computer graphics programming, where graphic transformations could be encoded as matrix multiplications. One of the first commercial computer graphics houses, Digital Effects, based in New York City, produced an APL graphics product named Visions, which was used to create television commercials and, reportedly, animation for the 1982 film Tron. Digital Effects' use of APL was informally described at several SIGAPL conferences in the late 1980s; examples discussed included the early UK Channel 4 TV logo/ident.

Interest in APL has declined from a peak in the mid-1980s. This appears partly due to the lack of smooth migration pathways from higher-performing, memory-intensive mainframe implementations to low-cost personal computer alternatives. APL implementations for computers predating the Intel 80386, released in the late 1980s, were suitable only for small applications. Another important reason for the decline was the lack of low-cost, standardized, robust, compiled APL executables usable across multiple computer hardware and OS platforms. There are several APL version permutations across various APL implementations, particularly differences between IBM's APL2 and APL2000's APL+ versions. Another practical limit is that APL has fallen behind modern integrated development environments in debugging abilities and test-driven development (TDD). Thus, while APL remains very suitable for small-to-medium-sized programs, productivity gains for larger projects involving teams of developers would be questionable.

The growth of end-user computing tools such as Microsoft Excel and Microsoft Access has indirectly eroded potential APL use. These are frequently appropriate platforms for what may have been APL applications in the 1970s and 1980s. Some APL users migrated to the programming language J, which offers some advanced features. Lastly, the decline was also due in part to the growth of MATLAB, GNU Octave, and Scilab. These scientific computing array-oriented platforms provide an interactive computing experience similar to APL, but more closely resemble conventional programming languages such as Fortran, and use standard ASCII characters. Other APL users continue to wait for a very low-cost, standardized, broad-hardware-usable APL implementation.

Despite this decline, APL finds continued use in some fields, such as accounting research, pre-hardcoded modeling, DNA identification technology, symbolic mathematical expression and learning. It remains an inspiration to its current user base, and to the design of other languages.




Standards

APL has been standardized by the American National Standards Institute (ANSI) working group X3J10 and International Organization for Standardization (ISO) and International Electrotechnical Commission (IEC), ISO/IEC Joint Technical Committee 1 Subcommittee 22 Working Group 3. The Core APL language is specified in ISO 8485:1989, and the Extended APL language is specified in ISO/IEC 13751:2001.

Source of the article: Wikipedia


