THE BRYCE IS RIGHT!

Software for the finest computer – The Mind

Archive for the ‘Computers’ Category

NO, I DO NOT WANT WINDOWS 10

Posted by Tim Bryce on June 20, 2016

BRYCE ON TECHNOLOGY

(Click for AUDIO VERSION)
To use this segment in a Radio broadcast or Podcast, send TIM a request.

For the last few months I have been bombarded with messages from Microsoft asking, no, begging me to upgrade to Windows 10, the latest version of their operating system. Frankly, I am not interested. I am staying with Windows 7, both at home and at the office, primarily because we still have a couple of DOS-based programs we regularly use, and there is no effective support for them on Windows 10 (or Windows 8, for that matter).

Day after day, we see little pop-up boxes asking us to upgrade, which we regularly ignore. I learned a long time ago never to use anything new from Microsoft, as it is way too buggy and not properly tested. Microsoft is one of those techie companies that relies on its customers to test its products. This is like an automotive company asking its customers to test its cars. Er, ah, no thanks. Frankly, I do not believe Microsoft knows how to competently test its own products. This is why I have never thought of MS as “state of the art.”

The company was so persistent about getting me to upgrade to Windows 10 (or is it downgrade?) that they even installed it on my home computer overnight. In the morning, I awoke to a new screen welcoming me to Windows 10. I began to panic, as I knew I didn’t want it, yet they had the audacity to install it without my permission. Fortunately, as I started through the first few steps of using it, I was asked whether I accepted the terms and conditions for using the product, whereupon I pressed the DECLINE button. I then heard my computer groan, or perhaps it was Bill Gates himself, as it removed Windows 10 and returned me to Windows 7. Wow, that was a close one.

I have some friends who, not knowing any better, accidentally accepted the terms and conditions, and now appear to be stuck with Windows 10, which they simply abhor.

Fortunately, after sniffing around on the Internet, I found a way to roll a computer back to Windows 7 or 8, and, no, it wasn’t authored by Microsoft. Evidently you have one month to reverse the process; after that, you are stuck with Windows 10. There is also a video on YouTube to walk you through the process.

Frankly, it is very disconcerting that Microsoft pushes this upgrade down the throats of customers who do not want it. It is intrusive, and I wonder how legal it is to do so. You can nudge all you want; just do not push.

Also published with News Talk Florida.

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2016 by Tim Bryce. All rights reserved.

NEXT UP:  THE MAIN EVENT: THE TRUMP/CLINTON DEBATES – Hold on to your seats, you won’t want to miss them.

LAST TIME:  SIGNATURES  – Why do some look better than others?

Listen to Tim on WZIG-FM (104.1) in Palm Harbor, FL; KIT-AM (1280) in Yakima, Washington “The Morning News” with hosts Dave Ettl & Lance Tormey (weekdays, 6:00-9:00am Pacific); and WWBA-AM (News Talk Florida 820). Or tune in to Tim’s channel on YouTube.

WHATEVER HAPPENED TO UNIVAC?

Posted by Tim Bryce on October 20, 2014

BRYCE ON HISTORY

– Why it is necessary to learn industrial history.

(Click for AUDIO VERSION)
To use this segment in a Radio broadcast or Podcast, send TIM a request.

Through my columns I occasionally write something related to American history. I do this because I believe young people are losing their sense of history and are doomed to repeat the mistakes we’ve made in the past. The same is true of industrial history, in my case the computer field. To illustrate, a few years ago I inherited my father’s UNIVAC Zippo lighter. I always admired it; it was small, sleek, and had an impressive UNIVAC logo engraved on it. I believe he got it back in the early 1960’s. As an aside, my father was one of the first fifty computer programmers in the United States, starting back in 1954 when he worked on the UNIVAC I at the U.S. Bureau of the Census. I also have his original programming book from 1954 and template (and photos), along with some print wheels from the first high-speed printer, a UNIVAC I magnetic tape (made of metal), and some plugboards. However, it was the small lighter he carried which I fancied.

Recently, I was at a meeting where I met a gentleman, approximately 40 years of age, who is also actively engaged in the computer business. I pulled him aside and proudly showed him the lighter. He looked at it with a blank stare and said, “What is a UNIVAC?” I was thunderstruck. Even though it represented the first commercial computer, he had no idea what it was, nor did he seem to care.

It occurred to me there is no sense of industrial history anymore. Through my father and my own personal experiences, I have a deep sense of history for my craft, but I must be an anomaly. Some time ago I wrote a paper entitled “A Short History of Systems Development,” in the hopes of recording some of these historical milestones. It was well received, but I fear students are not learning such lessons from their college professors, or simply do not care.

I also recently met with some high school students interested in a career in computing. Their sense of history only goes as far back as Microsoft, Apple, and the Internet. Most were knowledgeable about the C and C++ programming languages, but little else. I then asked them if they knew what a 4GL was; a handful knew. I next asked what a 1GL, 2GL, or 3GL was. None knew. I explained it as follows (a quick sketch follows the list):

1GL – First Generation Language – programming in machine language.
2GL – Second Generation Language – Assembly language.
3GL – Third Generation Language – procedural languages such as COBOL, Fortran, PL/1, C and C++.
4GL – Fourth Generation Language – interpreters/specification-driven tools that produce code.
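For the uninitiated, here is a rough sketch of the four generations using Python as the notation. Everything here is illustrative, the “machine” and its opcodes are invented for the example, but it shows how each generation moves further away from the hardware:

    # 1GL - machine language: raw numeric opcodes and operands.
    machine_code = [0x01, 0x05,   # LOAD 5 into the accumulator
                    0x02, 0x07,   # ADD 7 to the accumulator
                    0x03]         # STORE the result

    # 2GL - assembly language: the same instructions as mnemonics.
    assembly = ["LOAD 5", "ADD 7", "STORE TOTAL"]

    # 3GL - procedural language (COBOL, Fortran, PL/1, C): named data and
    # English-like verbs; one statement may compile to many instructions.
    def compute_total():
        price = 5
        tax = 7
        return price + tax    # cf. COBOL: ADD TAX TO PRICE GIVING TOTAL.

    # 4GL - specification-driven: you state WHAT you want and let an
    # interpreter or generator produce the procedure for you.
    report_spec = {"report": "order totals", "from": "orders", "sum": "total"}

    print(compute_total())  # -> 12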

I then went into a dissertation on how and why these languages were invented. As an aside, the 3GL was based on a manual procedural language derived from Broadway scripts (invented by Les Matthies, “The Dean of Systems”). When the Navy’s Admiral Grace Hopper led the development of COBOL (COmmon Business Oriented Language), she took Les Matthies’ “Playscript” technique and automated it. COBOL was then emulated and simplified by ensuing programming languages. We also discussed the premise behind the Java language (“Write once, run anywhere”).

I next asked if they were familiar with the various DBMS (Data Base Management System) models. Again, none knew anything about them. I then went on to explain the differences between the Hierarchical Model (e.g., IBM’s IMS and DBOMP), the CODASYL Network Model (e.g., IDS, TOTAL, IDMS, and ADABAS), the Relational Model (used by most systems today, e.g., DB2 and ORACLE), and the Object Oriented Model, which is slowly gaining acceptance. More importantly, I explained why the DBMS was invented. A large amount of the credit goes to Charles Bachman of GE/Honeywell, who invented IDS to implement Bill of Materials processing (BOMP) in manufacturing.
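To make the distinction concrete, here is a hedged sketch in Python (the part numbers and data are invented for illustration). The hierarchical model navigates a tree from parent to child, while the relational model keeps flat tables and matches key values:

    # Hierarchical model (e.g., IMS): records form a tree and you
    # navigate from parent to child. A nested structure stands in for it.
    bicycle = {
        "part": "bicycle",
        "components": [
            {"part": "wheel", "components": [{"part": "spoke", "components": []}]},
            {"part": "frame", "components": []},
        ],
    }

    # Relational model (e.g., DB2, ORACLE): flat tables joined by keys;
    # the match is declared instead of navigated.
    parts = [("P1", "bicycle"), ("P2", "wheel"), ("P3", "spoke"), ("P4", "frame")]
    structure = [("P1", "P2"), ("P2", "P3"), ("P1", "P4")]  # (parent, child) keys

    def children(parent_id):
        # Join 'structure' to 'parts' on the child key.
        names = dict(parts)
        return [names[child] for (parent, child) in structure if parent == parent_id]

    print(children("P1"))  # -> ['wheel', 'frame']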

My point to the young students, and to you, is that it is important to study the past so we do not replicate the same mistakes. This is what craftsmen do regardless of the industry. Regretfully, I see little of this in business anymore, particularly in the computer field. It is difficult to innovate and invent without a sense of such history. Considerable time and effort is wasted as a result.

As to UNIVAC itself (UNIVersal Automatic Computer), the computer was invented by the Eckert-Mauchly Computer Corporation, which became a division of Remington Rand. Remington Rand later merged with Sperry to form Sperry Rand, which dubbed the computer division Sperry Univac, then just UNIVAC. In 1986, the company merged with Burroughs Corporation, another maker of mainframe computers, to become UNISYS.

I think I will continue to carry my father’s UNIVAC lighter in case I run into more people involved with the computer business. It’s quite a conversation piece.

One last bit of trivia: who were the “BUNCH” competing with IBM in the mainframe wars of yesteryear? Answer: Burroughs, UNIVAC, NCR, CDC, and Honeywell. Where are they now?

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2014 by Tim Bryce. All rights reserved.

NEXT UP:  JOB CHECK, CHECK, CHECK, CHECK, CHECK… – Something for young people; describing the types of checks an employer will perform.

LAST TIME:  WORKING FOR GOONS  – Making the work environment unbearable.

Listen to Tim on WJTN-AM (News Talk 1240) “The Town Square” with host John Siggins (Mon, Wed, Fri, 12:30-3:00pm Eastern), and KIT-AM 1280 in Yakima, Washington “The Morning News” with hosts Dave Ettl & Lance Tormey (weekdays, 6:00-9:00am Pacific). Or tune in to Tim’s channel on YouTube.

ACCEPTING MEDIOCRITY IN COMPUTING

Posted by Tim Bryce on June 23, 2014

BRYCE ON TECHNOLOGY

– Just because you use Microsoft products doesn’t mean you are “state of the art.”

(Click for AUDIO VERSION)
To use this segment in a Radio broadcast or Podcast, send TIM a request.

Back in 1996, I helped organize a global effort to promote an IBM operating system for use on the PC; codenamed “Merlin,” it represented Release 4.0 of OS/2 Warp. For those of you ensconced in Microsoft products, there are alternatives to Windows, OS/2 being one. Originally introduced in 1987, OS/2 was a far superior operating system, and way ahead of its time. It offered a true object-oriented desktop, making use of a System Object Model (SOM) which allowed multiple programs to share data at the same time. It also had an easy-to-use and customizable Graphical User Interface (GUI), a sophisticated High Performance File System (HPFS), symmetric multiprocessing (SMP) support, crash protection, and much more. It could run DOS and Windows apps as well as native OS/2 programs. It was also the first PC operating system to support Java and speech recognition, offered true multitasking/multithreading, and was Internet-aware. It was an incredibly stable operating platform. After using OS/2 for a number of years, I had trouble adjusting to the Windows world, as I found it to be a quantum leap backwards. Everything I took for granted with OS/2 was simply not there in Windows. There was only one problem with OS/2: IBM didn’t know how to market it and inexplicably backed down from Microsoft.

For one day in October 1996 (October 26th), tiny Palm Harbor, Florida was the center of the OS/2 universe. Knowing a storm was brewing between IBM and Microsoft, OS/2 users leapt to the rescue in the form of a worldwide demonstration of OS/2 entitled “Connect the World with Merlin.”

Merlin was the codename for the next major release of OS/2 (v4.0), the last issued by IBM. As a show of support, OS/2 users rallied around the product and demonstrated it at computer stores, Internet cafes, universities, and PC user group meetings. Twenty-eight countries participated in the event, all orchestrated by the product’s customers, not the vendor. This was the first time such an event was conducted in this manner, and perhaps the only one to do so. During the 24 hours of the event, volunteers met in our offices in Palm Harbor and communicated with OS/2 users around the globe using the Internet, cameras, and native OS/2 software (NOTE: this was way before such things as Skype). I personally gave OS/2 presentations to consumers and students in Australia, Brazil, Norway, Sweden, the United Kingdom and, of course, throughout the United States. As each OS/2 site called in, we marked their location on a global map which was refreshed on the Internet. Dots appeared going from east to west as people followed the movement of the sun.

When the day was over, 165 sites had been contacted around the world, with over 1,000 volunteers participating in the event, not bad for a customer-driven marketing event. IBM thanked us for our support and we garnered considerable publicity in the process, but IBM nevertheless abandoned the product over the next few years. Its loyal customers persevered, though, and went on to create an annual user conference entitled “Warpstock,” thereby attaining cult status.

OS/2 may have been dropped by IBM, but it lives on as a hybrid product called “eComStation,” developed by Serenity Systems and Mensys under license from IBM, along with various third parties. There are still many proponents who understand the strength of the product and have no intention of sipping the MS Windows Kool-Aid. Even though OS/2 is far and away a better product, Microsoft was able to pound it into submission. The same is true for other products:

* Even though Lotus SmartSuite predated MS Office, it is Microsoft’s offering people are most familiar with. If you ever worked with Lotus SmartSuite, though, you realize the deficiencies of MS Office; it is like night and day. Lotus was purchased by IBM, which, again, botched the marketing of the product.

* Adobe “InDesign” and its predecessor, “PageMaker,” were impressive tools for desktop publishing. Yet it is MS Publisher (a component of MS Office) the public is more familiar with. Again, if you have used Adobe’s products, you realize the weaknesses of Microsoft’s offerings.

* The RealPlayer multimedia player predated MS Media Player. Further, Real’s peripheral products for recording and editing multimedia are vastly superior.

* The Google Chrome and Mozilla Firefox web browsers are vastly superior to MS Internet Explorer. Likewise, the Mozilla Thunderbird e-mail reader is more effective than MS Outlook, yet it is the latter that dominates market share.

There are many other examples, such as Intuit’s Quicken versus MS Money, and I could go on and on. The point is, if consumers do not know better, they will accept the status quo as “state of the art” when, in reality, it is substantially behind it. Products like OS/2, Lotus, and RealPlayer are cleaner, simpler, easier to use, and more stable. Nonetheless, it is marketing which dictates the state of the art, not technology.

Consider this: I still have two OS/2 Warp computers and they haven’t crashed in decades. That’s right, decades. Can you say the same for your MS Windows machines?

I chuckle when I hear someone say Bill Gates was a technical genius. Someone must be taking something in the arm to say things like this. A technical genius? Hardly. A marketing genius? Definitely.

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2014 by Tim Bryce. All rights reserved.

NEXT UP:  HOW OBAMA IS UNDERMINING DEMOCRATS – With friends like this, who needs enemies?

LAST TIME:  LIFE IS UNFAIR  – Murphy’s Laws have a tendency of upsetting us.

Listen to Tim on WJTN-AM (News Talk 1240) “The Town Square” with host John Siggins (Mon, Wed, Fri, 12:30-3:00pm Eastern), and KIT-AM 1280 in Yakima, Washington “The Morning News” with hosts Dave Ettl & Lance Tormey (weekdays, 6:00-9:00am Pacific). Or tune in to Tim’s channel on YouTube.

A SHORT HISTORY OF SYSTEMS DEVELOPMENT

Posted by Tim Bryce on February 10, 2014

BRYCE ON SYSTEMS

– Which came first, systems or the computer?

NOTE: People today believe information systems began with the advent of the computer. Hardly. Computers are but the latest method of implementation. Systems are actually as old as business itself, going back centuries. Herein I try to set the record straight based on my nearly 40 years in the industry. Hopefully, this will be interesting to systems practitioners as well as those who use systems on a daily basis.

“If they do not have an appreciation of whence we came, I doubt they will have an appreciation of where we should be going.”
– Bryce’s Law

INTRODUCTION

I always find it amusing when I tell a young person in this industry that I worked with punch cards and plastic templates years ago. It’s kind of the same dumbfounded look I get from my kids when I tell them we used to watch black-and-white television with three channels, no remote control, and station signoffs at midnight. It has been my observation that our younger workers do not have a sense of history; this is particularly apparent in the systems world. If they do not have an appreciation of whence we came, I doubt they will have an appreciation of where we should be going. Consequently, I have assembled the following chronology of events in the hopes it will provide some insight as to how the systems industry has evolved to its current state.

I’m sure I could turn this into a lengthy dissertation but, instead, I will try to be brief and to the point. Further, the following will have little concern for academic developments but rather how systems have been implemented in practice in the corporate world.

PRE-1950’S – “SYSTEMS AND PROCEDURES”

Perhaps the biggest revelation to our younger readers regarding this period will be that there was any form of systems prior to the advent of the computer. In fact, “Systems and Procedures” Departments predated the computer by several years. Such departments were concerned with the design of major business processes using “work measurement” and “work simplification” techniques derived from Industrial Engineering. Such processes were carefully designed using grid diagrams and flowcharts. There was great precision in the design of forms to record data, filing systems to manage paperwork, and the use of summary reports to act as control points in systems. For example, spreadsheets had been used extensively for many years prior to the introduction of Lotus 1-2-3 or MS Excel. There was also considerable attention given to human behavior during the business process (the precursor to “ergonomics”).

Systems were initially implemented by paper and pencil using ledgers, journals (logs), indexes, and spreadsheets. We have always had some interesting filing systems, everything from cards and folders, to storage cabinets.

Perhaps the earliest mechanical device was the ancient abacus, used for simple math (and still in use to this day). The late 1800’s saw the advent of cash registers and adding machines, popularized by such companies as NCR of Dayton, Ohio under John Patterson, who also introduced sweeping changes in terms of dress and business conduct. These practices were adopted by Thomas Watson, Sr., who worked for many years at NCR and carried them forward to IBM and the rest of the corporate world. Burroughs was also a major player in the early adding machine industry.

The first typewriters were also introduced in the late 1800’s, which had a tremendous effect on correspondence and order processing. This was led primarily by Remington Arms (later to become Remington Rand).

In the early 1900’s, tabulating equipment was introduced to support such things as census counting. It was then widely adopted by corporate America. Occasionally you will run into old-timers who can describe how they programmed such machines using plugboards. Punch card sorters were added as an adjunct to tabulating equipment.

As a footnote, most of what IBM’s Watson learned about business came from his early days at NCR. However, he had a falling out with Patterson, who fired him. As a small bit of trivia, the story goes that after Watson died, he was buried in Dayton on a hilltop overlooking NCR headquarters, the company he couldn’t conquer.

During World War II, both the U.S. military and industrial complex relied heavily on manually implemented systems. We did it so well that many people, including the Japanese, contend it gave the Allies a competitive edge during the war.

The lesson here, therefore, is that manually implemented systems were with us long before the computer and are still with us today. To give you a sense of history in this regard, consider one of our more popular Bryce’s Laws:

“The first on-line, real-time, interactive, data base system was double-entry bookkeeping which was developed by the merchants of Venice in 1200 A.D.”
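That law is less tongue-in-cheek than it sounds. The heart of double-entry bookkeeping is an integrity rule any data base designer would recognize: every transaction posts offsetting amounts that must net to zero, so the books always balance. A minimal sketch in Python, with accounts and figures invented for illustration:

    # Minimal double-entry ledger: each transaction is a set of postings
    # whose debits and credits must net to zero before it is accepted.
    ledger = []

    def post(description, postings):
        # postings: list of (account, amount); debits positive, credits negative.
        if sum(amount for _, amount in postings) != 0:
            raise ValueError("transaction does not balance")
        ledger.append((description, postings))

    post("Sell goods for cash", [("Cash", 100), ("Sales", -100)])
    post("Buy supplies on credit", [("Supplies", 40), ("Accounts Payable", -40)])

    # The trial balance nets to zero across all accounts, by construction.
    assert sum(amount for _, postings in ledger for _, amount in postings) == 0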

One major development in this area was the work of Leslie “Les” Matthies, the legendary Dean of Systems. Les graduated from the University of California at Berkeley during the Depression with a degree in Journalism. Being a writer, he tried his hand at writing Broadway plays. But work was hard to come by during this period, and when World War II broke out, Les was recruited by an aircraft manufacturer in the Midwest to systematize the production of aircraft. Relying on his experience as a writer, he devised the “Playscript” technique for writing procedures. Basically, Les wrote a procedure like a script to a play: there was a section to identify the procedure along with its purpose; a “Setup” section to identify the forms and files to be used during it; and an “Operations/Instructions” section which described the “actors” performing the tasks, using verbs and nouns to properly state each operation. He even went so far as to devise rules for writing “If” statements.

For details on “Playscript,” see – “The Language of Systems”

“Playscript” became a powerful procedure writing language and was used extensively throughout the world. It is still an excellent way to write procedures today. Ironically, Les did not know what a profound effect his technique would have later on in the development of computer programs.
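To give a flavor of the technique, here is a rough sketch of the Playscript layout expressed in Python; the procedure, actors, and wording are my own invention for illustration, not Matthies’ originals:

    # Playscript organizes a written procedure like a script to a play:
    # an identification/purpose header, a "Setup" naming the forms and
    # files used, and numbered operations, each assigned to an "actor"
    # and stated verb-first.
    procedure = {
        "identification": "Customer credit approval",
        "purpose": "Decide whether to extend credit to a new customer",
        "setup": ["Credit application form", "Customer master file"],
        "operations": [
            ("Sales Clerk",    "RECEIVE completed application"),
            ("Credit Analyst", "PULL customer payment history"),
            ("Credit Analyst", "IF history is unsatisfactory, ROUTE to manager"),
            ("Credit Manager", "APPROVE or DECLINE the application"),
        ],
    }

    for step, (actor, instruction) in enumerate(procedure["operations"], 1):
        print(f"{step}. {actor}: {instruction}")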

1950’s – INTRODUCTION OF THE COMPUTER

Yes, I am aware that the ENIAC was developed for the military at the end of World War II. More importantly, the UNIVAC I (UNIVersal Automatic Computer) was introduced in 1951 by J. Presper Eckert and John Mauchly. The UNIVAC I was a mammoth machine originally developed for the U.S. Bureau of the Census. Corporate America took notice of the computer, and companies such as DuPont in Wilmington, Delaware began to line up to experiment with it for commercial purposes. The Remington Rand Corporation sponsored the project, and its computer business eventually took the “UNIVAC” name (today it is “UNISYS,” representing the merger of UNIVAC’s parent, Sperry, with Burroughs).

The UNIVAC I offered a sophistication unmatched by other manufacturers, most notably IBM’s tabulating equipment. This caused IBM to invent the 701 and its 700 series. Other manufacturers quickly joined the fray and computing began to proliferate. Although UNIVAC was the pioneer in this regard, it quickly lost market share due to the marketing muscle of IBM. For quite some time the industry was referred to as “IBM & the BUNCH” (Burroughs, UNIVAC, NCR, CDC, and Honeywell).

Programming the early machines was difficult, as it was performed in a seemingly cryptic machine language (the first generation language). This eventually gave way to assembly language (the second generation language), which was easier to read and understand. Regardless, many of the utilities we take for granted today (e.g., sorts and merges) simply were not available and had to be developed. In other words, programming was a laborious task during this period.

Recognizing both the limitations and potential of the computer, the 1950’s represented the age of experimentation for corporate America. Here, the emphasis was not on implementing major systems through the computer, but rather on developing an assortment of programs to test the machine as a viable product. As such, programmers were considered odd characters who maintained “the black box,” and were not yet considered a part of the mainstream of systems development. The “Systems and Procedures” departments still represented the lion’s share of systems work in corporate America, with an occasional foray to investigate the use of the computer. The computer people were segregated into “computer departments” (later to be known as “EDP” or “Data Processing” departments).

1960’s – MANAGEMENT INFORMATION SYSTEMS

Competition between computer manufacturers heated up during this decade, resulting in improvements in speed, capacity, and capabilities. Of importance here was the introduction of the much-touted IBM System/360 (the number was selected to denote it was a comprehensive solution – 360 degrees). Other computer vendors offered products with comparable performance, if not more so, but the IBM 360 was widely adopted by corporate America.

The programming of computers was still a difficult task and, consequently, Procedural Languages were introduced (the third generation languages). In actuality, these languages got their start in the late 1950’s, but the proliferation of computers in the 1960’s triggered the adoption of procedural languages such as COBOL, FORTRAN, and PL/1. Interestingly, these languages were patterned after Les Matthies’ “Playscript” technique, which made active use of verbs, nouns, and “if” statements.

The intent of the Procedural Languages was twofold: to simplify programming by using more English-like languages, and to create universal languages that would cross hardware boundaries. The first goal was achieved; the second was not. If the languages were truly universal, it would mean that software would be portable across all hardware configurations. Manufacturers saw this as a threat; making software truly portable would make the selection of hardware irrelevant and, conceivably, allow customers to migrate away from their computer vendors. To avoid this, small nuances were introduced into the compilers for the Procedural Languages, thereby negating the concept of portability. This issue would be ignored for many years until the advent of the Java programming language.

The 1960’s also saw the introduction of the Data Base Management System (DBMS). Such products were originally designed as file access methods for Bill of Materials Processing (BOMP) as used in manufacturing. The “DBMS” designation actually came afterwards. Early pioneers in this area included Charlie Bachman of G.E. with his Integrated Data Store (IDS), which primarily operated under Honeywell GCOS configurations; Tom Richley of Cincom Systems, who developed TOTAL for Champion Paper; and IBM’s BOMP and DBOMP products. In 1969, IBM introduced IMS, which became its flagship DBMS product for several years.

With the exception of IMS, the early DBMS offerings were based on a network model which performed chain-processing. IMS, on the other hand, was a hierarchical model involving tree-processing.

Realizing that programming and data access was becoming easier and computer performance being enhanced, companies now wanted to capitalize on this technology. As a result, corporate America embarked on the era of “Management Information Systems” (MIS) which were large systems aimed at automating business processes across the enterprise. These were major system development efforts that challenged both management and technical expertise.

It was the MIS that married “Systems and Procedures” departments with computing/EDP departments and transformed the combined organization into the “MIS” department. This was a major milestone in the history of systems. The systems people had to learn about computer technology and the programmers had to learn about business systems.

Recognizing that common data elements were used to produce the various reports from an MIS, it started to become obvious that data should be shared and reused in order to eliminate redundancy, and to promote system integration and consistent data results. Consequently, Data Management (DM) organizations were started, the first at the Quaker Oats Company in Chicago, Illinois in 1965. The original DM organizations were patterned after Inventory Control Departments, where the various components were uniquely identified, shared, and cross-referenced. To assist in this regard, such organizations made use of the emerging DBMS technology. Unfortunately, many DM organizations lost sight of their original charter and, instead, became obsessed with the DBMS. Data as used and maintained outside of the computer was erroneously considered irrelevant. Even worse, the DBMS was used as nothing more than an elegant access method by programmers. Consequently, data redundancy plagued systems almost immediately, and the opportunity to share and reuse data was lost. This is a serious problem that persists in companies to this day.

1970’s – AWAKENING

Although the MIS movement was noble and ambitious in intent, it floundered due to the size and complexity of the task at hand. Many MIS projects suffered from false starts and botched implementations. This resulted in a period where a series of new methods, tools, and techniques were introduced to rein in these huge development efforts.

The first was the introduction of the “methodology,” which provided a road map or handbook on how to successfully implement systems development projects. This was pioneered by M. Bryce & Associates (MBA) with its “PRIDE” methodology in 1971. Although the forte of “PRIDE” was how to build systems, it was initially used for little more than documentation and as a means to manage projects. Following “PRIDE” came John Toellner’s Spectrum I methodology and SDM/70 from Atlantic Software. Several CPA-based methodologies followed thereafter.

Also during this time, mainframe-based Project Management Systems were coming into vogue, including Nichols N5500, PAC from International Systems, and PC/70 from Atlantic Software.

The early methodologies and Project Management Systems give evidence of the orientation of systems departments at that time: a heavy emphasis on Project Management. Unfortunately, the belief that Project Management was the problem was a fallacy; people simply didn’t know how to design and build systems in a uniform manner. As companies eventually learned, Project Management is useless without a clear road map for how to build something.

In the mid-to-late 1970’s, several papers and books were published on how to productively design software, marking the beginning of the “Structured Programming” movement. This was a large body of work from such programming luminaries as Barry Boehm, Frederick P. Brooks, Larry Constantine, Tom DeMarco, Edsger Dijkstra, Chris Gane, Michael A. Jackson, Donald E. Knuth, Glenford J. Myers, Trish Sarson, Jean-Dominique Warnier, Gerald M. Weinberg, Ed Yourdon, and many others. Although their techniques were found useful for developing software, the movement led to confusion in the field in differentiating between systems and software. To many, they were synonymous. In reality, they are not. Software is subordinate to systems, but the growing emphasis on programming was causing a change in perspective.

The only way systems communicate internally or externally to other systems is through shared data; it is the cohesive bond that holds systems (and software) together. This resulted in the introduction of Data Dictionary technology. Again, this was pioneered by MBA with its “PRIDE” methodology (which included a manually implemented Data Dictionary) and later with its “PRIDE”-LOGIK product in 1974. This was followed by Synergetics’ Data Catalogue, Data Manager from Management Software Products (MSP), and Lexicon by Arthur Andersen & Company.

The intent of the Data Dictionaries was to uniquely identify and track where data was used in a company’s systems. They included features for maintaining documentation, impact analysis (to allow the studying of a proposed change), and redundancy checks. “PRIDE”-LOGIK had the added nuance of cataloging all of the systems components, thereby making it an invaluable aid for design and documentation purposes.

The Data Dictionary was also a valuable tool for controlling DBMS products and, as such, several adjunct products were introduced, such as UCC-10, DB/DC Data Dictionary, and the Integrated Data Dictionary (IDD) from Cullinet. Unlike the other general purpose Data Dictionaries, these products were limited to the confines of the DBMS and didn’t effectively track data outside of their scope.

DBMS packages proliferated during this period, with many new products being introduced, including ADABAS, Image, Model 204, and IDMS from Cullinet (which was originally produced at BF Goodrich). All were based on the network model for file access, which was finally adopted as an industry standard (CODASYL).

There were a few other notable innovations introduced, including IBM’s Business Systems Planning (BSP) which attempted to devise a plan for the types of systems a company needed to operate. Several other comparable offerings were introduced shortly thereafter. Interestingly, many companies invested heavily in developing such systems plans, yet very few actually implemented them.

Program Generators were also introduced during this period. They included report writers that could interpret data and became a natural part of the repertoire of DBMS products, as well as products that could generate program source code (predominantly COBOL) from specifications, such as System-80 (Phoenix Systems), GENASYS (Generation Sciences), and JASPOL (J-Sys of Japan), to mention but a few.

MBA also introduced a generator of its own in 1979 – a systems generator initially named ADF (Automated Design Facility), which could automatically design whole systems, complete with an integrated data base. Based on information requirements submitted by a Systems Analyst, ADF interacted with the “PRIDE”-LOGIK Data Dictionary to design new systems and, where appropriate, modify existing ones. Because of its link to LOGIK, ADF emphasized the need to share and reuse information resources. Not only was it useful as a design tool, it was also convenient for documenting existing systems. The only drawback to ADF was that the mindset of the industry was shifting from systems to software. Consequently, program generators captured the imagination of the industry instead of ADF.

The increase in computer horsepower, coupled with new programming tools and techniques, caused a shift in perspective in MIS organizations. Now, such departments became dominated by programmers, not systems people. It was here that the job titles “Systems Analyst” and “Programmer” were married to form a new title of “Programmer/Analyst” with the emphasis being on programming and not on front-end systems design. Many managers falsely believed that developers were not being productive unless they were programming. Instead of “Ready, Aim, Fire,” the trend became “Fire, Aim, Ready.”

Data Management organizations floundered during this period with the exception of Data Base Administrators (DBA’s) who were considered the handmaidens of the DBMS.

The proliferation of software during this decade was so great that it gave rise to the packaged software industry. This went far beyond computer utilities and programming tools; it included whole systems for banking, insurance, and manufacturing. As a result, companies were inclined to purchase and install these systems as opposed to reinventing the wheel. Among their drawbacks, though, was that they normally required tailoring to satisfy the customer’s needs, which meant modifying the program source code. Further, the customer’s data requirements had to be considered to assure there were no conflicts in how the customer used and assigned data. After the package had been installed, the customer was faced with the ongoing problem of modifying and enhancing the system to suit their ever-changing needs.

1980’s – THE TOOL-ORIENTED APPROACH

As big iron grew during the 1960’s and 1970’s, computer manufacturers identified the need for smaller computers to be used by small to medium-sized businesses. In the 1970’s, people were skeptical of their usefulness, but by the 1980’s their power and sophistication caused the “mini” computer to gain in popularity, either as a general purpose business machine or dedicated to a specific system. Among the most popular of the “mini” computers were:

IBM’s System 36/38 series (which led to the AS/400)
DEC PDP Series (which gave way to the DEC VAX/VMS)
Hewlett-Packard’s HP-3000 series with MPE
Data General Eclipse series with AOS
PRIME

The competition was fierce in the “mini” market, which resulted in considerable product improvements and better value to the customer. Instrumental to the success of the mini was the adoption of UNIX, developed by Bell Labs, a powerful multi-user, multitasking operating system that was eventually adopted by most, if not all, mini manufacturers.

But the major development in computer hardware was not the mainframe, nor the mini; it was the “micro” computer, first popularized by Apple in the late 1970’s. IBM countered with its Personal Computer (PC) in the early 1980’s. At first, the micro was considered nothing more than a curiosity, but it quickly gained in popularity due to its inexpensive cost and a variety of “apps” for word processing, spreadsheets, graphics, and desktop publishing. The micro caught on like wildfire and spread across corporate desktops.

By the mid-1980’s the “micro” (most notably the PC) had gained in power and sophistication. So much so that a series of graphics-based products were used for software development in support of the Structured Programming movement of the 1970’s. Such tools were dubbed “CASE” (Computer Aided Software Engineering) and allowed developers to draw their favorite software diagrams without pencil and paper. Early CASE pioneers included Index Technology, Knowledgeware, Visible Systems, Texas Instruments, and Nastec, as well as many others. CASE tools took the industry by storm, with just about every MIS organization purchasing a copy either for experimental use or for full application development. As popular as the tools initially were, there is little evidence they produced any major systems; instead, they helped in the design of single programs.

Recognizing the potential of the various CASE tools, IBM in the late 1980’s devised an integrated development environment that included IBM’s products as well as third parties, and entitled it “AD/Cycle.” However, IBM quickly ran into problems with the third party vendors in terms of agreeing on technical standards that would enable an integrated environment. Consequently, the product ran aground not long after it was launched. In fact, the prosperity of the CASE market was short-lived as customers failed to realize the savings and productivity benefits as touted by the vendors. By the early 1990’s, the CASE market was in sharp decline.

Instead, companies turned to Programmer Workbenches which included an all-in-one set of basic tools for programming, such as editing, testing, and debugging. Microsoft and Micro Focus did particularly well in offering such products.

Data Base Management Systems also took a noticeable turn in the 1980’s with the advent of “relational” products involving tables and keys. The concept of the relational model was originally developed by IBM Fellow and mathematician Edgar (Ted) Codd in a paper from 1970. The relational DBMS was superior to the earlier network and hierarchical models in terms of ease of use. The problem resided in the amount of computer horsepower needed to make it work, a problem that was overcome by the 1980’s. As a result, new DBMS products such as Oracle and Ingres were introduced, which quickly overtook their older competitors. There was an initial effort to convert DBMS mainstays such as TOTAL, ADABAS, and IDMS into relational products, but it was too little, too late. As for IBM, they simply re-labeled their flagship product, IMS, as a “transaction processor” and introduced a totally new offering, DB2, which quickly dominated the DBMS mainframe market.
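The appeal of tables and keys is easy to demonstrate. Here is a minimal sketch using Python’s built-in sqlite3 module (table and column names invented): rather than navigating pointer chains, the program declares a join on matching key values and lets the DBMS find the rows:

    import sqlite3

    # Two flat tables, related only by a shared key value; no pointers,
    # no navigation.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
    db.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
    db.execute("INSERT INTO orders VALUES (10, 1, 'widgets')")

    # The join is declared, not navigated; the DBMS resolves it.
    rows = db.execute("""
        SELECT customers.name, orders.item
        FROM customers JOIN orders ON orders.customer_id = customers.id
    """).fetchall()
    print(rows)  # -> [('Acme Corp', 'widgets')]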

Program generators continued to do well during the 1980’s but it was during this period that 4GL’s (fourth generation languages) were introduced to expedite programming. The 4GL was a natural extension of the DBMS and provided a convenient means to develop programs to interpret data in the data base.

Another development worth noting is the evolution of the Data Dictionary into “Repositories” (also referred to as “Encyclopedias”) used to store the descriptions of all of an organization’s information resources. One of the motivating factors behind this was IBM (for AD/Cycle), which realized it needed some sort of cohesive bond for the various CASE tools to interface. This is another area pioneered by MBA, which introduced its “PRIDE”-Enterprise Engineering Methodology (EEM) to study a business and formulate an Enterprise Information Strategy, and its “PRIDE”-Data Base Engineering Methodology (DBEM) to develop the corporate data base, both logically and physically. To implement these new methodologies, the “PRIDE”-LOGIK Dictionary was expanded to include business models and data models. By doing so, MBA renamed “PRIDE”-LOGIK the “PRIDE”-IRM (Information Resource Manager), complementing its concept of Information Resource Management.

In terms of the MIS infrastructure, two noteworthy changes occurred. First was the introduction of the Chief Information Officer (CIO), as first described in the popular book “Information Systems Management In Practice” (McNurlin, Sprague) in January 1986. Basically, the MIS Director was elevated to a higher management level where, theoretically, he/she operated on the same level as a company’s Chief Operating Officer (COO) and Chief Financial Officer (CFO). In reality, this has never truly happened and, in many cases, the title “CIO” is nothing more than a change in name, not in stature. The second change was the change in job title from “Programmer” to “Software Engineer.” Again, we are primarily talking about semantics. True, many of the programmers of the 1980’s studied Structured Programming, but very few truly understood the nature of engineering as it applies to software; most were just glorified coders. Nonetheless, the “Software Engineer” title is still actively used today. In contrast, the last of the true “Systems Analysts” slowly disappeared. Here, too, is evidence of the change of focus from systems to software.

During the 1980’s we also saw the emergence of MBAs graduating from the business schools and working their way into the corporate landscape. Although they didn’t have an immediate impact on the systems world, they had a dramatic effect on the corporate psyche. Their work resulted in severe corporate cutbacks, downsizing, and outsourcing. This changed the corporate mindset to think short-term as opposed to long-term. Following this, companies shied away from major systems projects (such as the MIS projects of the 1960’s) and were content tackling smaller programming assignments; thus the term “app” was coined to describe a single program application.

Interestingly, a “quality” movement flourished in the 1980’s based on the works of W. Edwards Deming and Joseph M. Juran, who pioneered quality control principles in the early part of the 20th century. Unfortunately, their early work was unappreciated in America and, consequently, they applied their talents to help rebuild the industrial complex of postwar Japan. Only late in their lives did they receive recognition for their work in the United States (after Japan became an economic powerhouse). Another influential factor was the introduction of the ISO 9000 standard for quality management, originally devised by the British and later adopted as an international standard. Little attention would probably have been paid to ISO 9000 were it not for the fact that European businesses started to demand compliance in order to do business with them.

Nevertheless, these factors resulted in a reorientation of American businesses to think in terms of developing quality products which, inevitably, affected how systems and software were produced. The real impact of the quality movement though wouldn’t be felt in the systems world until the next decade.

To summarize the 1980’s from a systems development perspective, the focus shifted away from major systems to smaller programming assignments which were implemented using newly devised CASE tools. This fostered a “tool-oriented approach” to development whereby companies spent considerably on the latest programming tools but little on management and upfront systems work. In other words, they bought into the vendor’s claims of improved programmer productivity through the use of tools. Unfortunately, it resulted in patchwork systems that required more time in maintenance as opposed to modifying or improving systems. “Fire fighting” thereby became the normal mode of operation in development.

1990’s – REDISCOVERY

As the PC gained in stature, networking became very important to companies so that workers could collaborate and communicate on a common level. Local Area Networks (LAN) and Wide Area Networks (WAN) seemed to spring up overnight. As the PC’s power and capacity grew, it became obvious that companies no longer needed the burden of mainframes and minis. Instead, dedicated machines were developed to control and share computer files, hence the birth of “client/server computing,” where client computers on a network interacted with file servers. This did not completely negate the need for mainframes and minis (which were also used as file servers), but it did have a noticeable impact on sales. Companies still needed mainframes to process voluminous transactions and extensive number-crunching, but the trend was to move away from big iron.

Thanks to the small size of the PC, companies no longer required a big room to maintain the computer. Instead, computers were kept in closets and under desks. This became so pervasive that companies no longer knew where their computer rooms were anymore. In a way, the spread of computers and networks closely resembled the nervous system of the human body.

One of the key elements that made this all possible was the introduction of Intel’s 80386 (or “386”) chip, which allowed 32-bit processing. To effectively use this new technology, new operating systems had to be introduced, the first being IBM’s OS/2 in the late 1980’s. OS/2 provided such things as virtual memory, multitasking and multithreading, network connectivity, crash protection, a new High Performance File System, and a slick object-oriented desktop. Frankly, there was nothing else out there that could match it. Unfortunately, Microsoft bullied its way past OS/2 with Windows 95 & NT. By the end of the 1990’s, OS/2 was all but forgotten by its vendor, IBM. Nevertheless, it was the advent of 32-bit computing that truly made client/server computing a reality.

Another major milestone during this decade was the adoption of the Internet by corporate America. The Internet actually began in the late 1960’s under the Department of Defense and was later opened to other government and academic bodies. But it wasn’t until the 1990’s that companies started to appreciate the Internet as a communications and marketing medium.

Tim Berners-Lee developed the World Wide Web protocol, along with the first web browser, in 1990. Early web browsers included Mosaic, Netscape Navigator, and Microsoft’s Internet Explorer, among others. The beauty of the Internet was that all computers could now access it regardless of operating system, making it a truly universal approach to accessing data. To write a web page, a simple tag language was devised, the Hyper Text Markup Language (HTML), which the browser interprets at the time of request to display the page. HTML was fine for developing simple static web pages (not much interaction; you simply view the page). Developers then invented new techniques to make web pages dynamic, thereby allowing people to input data and interact with files, which ultimately allowed for the merchandising of products over the Internet.
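The static/dynamic distinction can be sketched in a few lines. This is illustrative only, 1990’s sites used CGI scripts rather than Python functions, but the principle is the same: a static page is the same bytes for every visitor, while a dynamic page is generated from data at the time of the request:

    STATIC_PAGE = "<html><body><h1>Welcome to our store</h1></body></html>"

    def static_page(request):
        # Identical for every request; the server just hands back the file.
        return STATIC_PAGE

    def dynamic_page(request, inventory):
        # Content depends on data at the moment of the request, the step
        # that made merchandising products over the Internet possible.
        items = "".join(f"<li>{name}: ${price}</li>"
                        for name, price in inventory.items())
        return f"<html><body><ul>{items}</ul></body></html>"

    print(dynamic_page(None, {"widget": 9.95, "gadget": 19.95}))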

Wanting to do something more sophisticated through the web browser, Sun Microsystems developed the Java programming language in 1995. Java was a universal programming language that could run under any operating system. Their mantra was “Write once, run anywhere.” This was a radical departure from programming in the past, where it was necessary to recompile programs to suit the peculiarities of a particular operating system. Basically, Java made the operating system irrelevant, much to Microsoft’s chagrin. Further, Java could be used in small pocket devices as well as in the new generation of computers powering automobiles. This did not sit well with Microsoft, which ultimately fought the propagation of Java.

By the 1990’s the Structured Programming movement had fizzled out. Instead, “Object Oriented Programming” (OOP) gained in popularity. The concept of OOP was to develop bundles of code to model real-world entities such as customers, products, and transactions. OOP had a profound effect on Java as well as the C++ programming language.
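A minimal sketch of the idea in Python (classes and fields invented for illustration): each object bundles the data for a real-world entity with the code that operates on it:

    class Product:
        def __init__(self, name, price):
            self.name = name
            self.price = price

    class Transaction:
        def __init__(self, customer, product, quantity):
            self.customer, self.product, self.quantity = customer, product, quantity

        def total(self):
            return self.product.price * self.quantity

    class Customer:
        def __init__(self, name):
            self.name = name
            self.orders = []

        def place_order(self, product, quantity):
            # The customer object owns its own order history.
            self.orders.append(Transaction(self, product, quantity))

    acme = Customer("Acme Corp")
    acme.place_order(Product("widget", 9.95), 10)
    print(acme.orders[0].total())  # -> 99.5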

During this time, source code generators faded from view. True, companies were still using report writers and 4GL’s, but the emphasis turned to “Visual Programming”: programming workbenches with screen-painting tools to lay out inputs and outputs.

The Relational DBMS movement was still in high gear, but the use of Repositories and Data Dictionaries dropped off noticeably. Of interest though was the introduction of “Object Oriented Data Base Management System” (OODBMS) technology. Like OOP, data was organized in a DBMS according to real-world entities. Regardless, Relational DBMS dominated the field.

Also during this decade “Data Mining” became popular whereby companies were provided tools to harvest data from their DBMS. This effort was basically an admission that companies should learn to live with data redundancy and not be concerned with developing a managed data base environment.

Because of the radical changes in computer hardware and software, companies became concerned with their aging “legacy” systems as developed over the previous thirty years. To migrate to the new technology, a movement was created called “Business Process Re-engineering” (BPR). This was encouraging in the sense that companies were starting to think again in terms of overall business systems as opposed to just programs. I’m not sure I agree with the use of the term “Re-engineering,” though; it assumes that something was engineered in the first place (which was hardly the case in these older systems).

Nonetheless, CASE-like tools were introduced to define business processes. Suddenly, companies were talking about such things as “work flows,” “ergonomics,” and “flowcharts,” topics that had not been discussed for twenty years during the frenzy of the Structured Programming movement. Ultimately, this all led to the rediscovery of systems analysis; there was more to systems than just software. But by this time, all of the older corporate Systems Analysts had either retired or been put out to pasture, leaving a void in systems knowledge. Consequently, the industry started to relearn systems theory, with a lot of missteps along the way.

Companies at this time were still struggling to devise a suitable development environment. Most were content with just maintaining their current systems in anticipation of the pending Y2K (Year 2000) problem, where date fields were to change from 19XX to 20XX, which could potentially shut down companies. However, a few companies began to consider how to apply more scientific principles to the production of systems. Since people were already talking about “Software Engineering,” why not apply engineering/manufacturing principles to the development of total systems?
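The Y2K defect itself is small enough to show in a few lines. A hedged sketch (the field layout is invented): legacy records saved space by storing the year in two digits, and programs assumed the missing century was “19”:

    def age_in_years(birth_yy, current_yy):
        # Both fields are two-digit years; the century is assumed.
        return (1900 + current_yy) - (1900 + birth_yy)

    print(age_in_years(65, 99))  # 1999 - 1965 -> 34, correct
    print(age_in_years(65, 0))   # 2000 misread as 1900 -> -65, the Y2K bug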

Back in the early 1980’s, Japan’s Ministry of International Trade & Industry (MITI) coordinated a handful of Japanese computer manufacturers in establishing a special environment for producing system software, such as operating systems and compilers. This effort came to be known as the Japanese “Software Factories,” which captured the imagination of the industry. Although the experiment ended with mixed results, they discovered that organization and discipline could dramatically improve productivity.

Why the experiment? Primarily because the Japanese recognized there are fundamentally two approaches to manufacturing anything: “one at a time” or mass production. Both are consistent approaches that can produce a high quality product. The difference resides in the fact that mass production offers increased volume at lower costs. In addition, workers can be easily trained and put into production. On the other hand, the “one at a time” approach is slower and usually has higher costs. It requires workers to be intimate with all aspects of the product.

MBA took it a step further by introducing their concept of an “Information Factory” in the early 1990’s. The Information Factory was a comprehensive development environment which implemented MBA’s concept of Information Resource Management. Basically, they drew an analogy between developing systems to an engineering/manufacturing facility, complete with assembly lines, materials management and production control. These concepts were proven effective in companies throughout Japan, most notably Japan’s BEST project, which was sponsored by the Ministry of Finance. As background, the ministry wanted to leapfrog the west in terms of banking systems. To do so, they assembled a team of over 200 analysts and programmers from four of the top trust banks in Japan; Yasuda Trust & Banking, Mitsubishi Trust & Banking, Nippon Trust & Banking, and Chuo Trust & Banking. By implementing MBA’s concepts they were able to deliver over 70 major integrated systems in less than three years. Further, because they had control over their information resources using a materials management philosophy, the Y2K problem never surfaced.

In terms of infrastructure, development organizations essentially went unchanged, with a CIO at the top of the pyramid supported by Software Engineers and DBA’s. But there was one slight difference: instead of being called an MIS or IS department, the organization was now referred to as “IT” (Information Technology). Here again, the name hints at the direction most organizations were taking.

Finally, the 1990’s marked a change in the physical appearance of the work force. Formal suits and ties gave way to casual Polo shirts and Docker pants. At first, casual attire was only allowed on certain days (such as Fridays), but it eventually became the normal mode of dress. Unfortunately, many people abused the privilege and dressed slovenly for work. This had a subtle but noticeable effect on work habits, including how we build systems.

2000’s – GADGETS

We are now past the halfway point in this decade, and there is nothing of substance to report in terms of computer hardware, other than that our machines have gotten faster and smaller, with even more capacity. Perhaps the biggest innovation in this regard is the wide variety of “gadgets” that have been introduced, all of which interface with the PC, including Personal Digital Assistants (PDA’s), iPods, MP3 players, digital cameras, portable CD/DVD players (and burners), cell phones, and PS2 and Xbox game players. These devices are aimed at either communications or entertainment, giving us greater mobility, yet making us a bit dysfunctional socially. All of this means the computer has become an integral part of our lives, not just at work but at home as well.

Shortly after taking the reins of IBM in 2003, CEO Sam Palmisano introduced “On-Demand Computing” as the company’s thrust for the years ahead and, inevitably, it will mark his legacy. The concept as described by Palmisano was simple: treat computing like a public utility whereby a company can draw upon IBM for computing resources as required. “On-Demand Computing” made a nice catch-phrase and was quickly picked up by the press, but many people were at a loss as to what it was all about. Some of the early developments resulting from IBM’s “e-Business On Demand” research included balancing the load on file servers, which makes sense. But IBM is perhaps carrying the analogy too far by stressing that “on demand” is the manner by which companies should run in the future. Basically, the theory suggests we abandon capacity planning and rely on outside vendors to save the day. Further, it implies computers supersede the business systems they are supposed to serve. Instead of understanding the systems which run a business, just throw as many computing resources as you need at the problem. This is putting the cart before the horse.

The “on-demand” movement has evolved into “Service Oriented Architectures” (SOA), where vendors are introducing “on-demand” applications that take care of such tasks as payroll, marketing, etc. through the Internet. Again, it all sounds nice, but as far as I can see, this is essentially no different from service bureaus like ADP, which for years provided such processing facilities. Now, companies are being asked to swap out their internal programs for third-party products. I fail to see how this differs from buying any other packaged solution, other than that an outsider will be taking care of your software.

The need to build software faster has reached fever pitch. So much so that full-bodied development methodologies have been abandoned in favor of what is called “Agile” or “Extreme Programming,” which are basically quick-and-dirty methods for writing software using power programming tools. To their credit, those touting such approaches recognize they are limited to software (not total systems) and are not a substitute for a comprehensive methodology. Agile/Extreme Programming is gaining considerable attention in the press.

Next, we come to “Enterprise Architecture,” which is derived from a paper written by IBM’s John A. Zachman, who observed that it was possible to apply architectural principles to the development of systems. This is closely related to the consultants who extol the virtues of capturing “business rules,” essentially a refinement of the Entity Relationship (ER) diagramming techniques popularized a decade earlier using CASE tools.

As in the 1990’s, concepts such as “Enterprise Architecture” and “business rules” are indicative of the industry trying to reinvent systems theory.

CONCLUSIONS

Like computer hardware, the trend over the last fifty years in systems development has been to think smaller. Developers operate in a mad frenzy to write programs within a 90-day time frame. Interestingly, they all know that their corporate systems are large, yet they are content to attack them one program at a time. Further, there seems to be little concern that their work be compatible with anyone else’s; systems integration is treated as someone else’s problem. Often you hear the excuse, “We don’t have time to do things right.” Translation: “We have plenty of time to do things wrong.” Any shortcut to get through a project is rationalized and any new tool promising improved productivity is purchased. When companies attempt to tackle large systems (which is becoming rare), the effort is usually met with disaster. Consequently, companies are less confident in their abilities and shy away from large system development projects.

Corporate management is naive in terms of comprehending the value of information and has not learned how to use it for competitive advantage (unlike its foreign competitors). Further, it is oblivious to the problems in systems development. Executives believe their systems are being developed with a high degree of craftsmanship, that they are integrated, and that they are easy to maintain and update. They are shocked when they discover this is not the case.

The problems with systems today are no different than fifty years ago:

End-user information requirements are not satisfied.
Systems lack documentation, making maintenance and upgrades difficult.
Systems lack integration.
Data redundancy plagues corporate data bases.
Projects are rarely delivered on time and within budget.
Quality suffers.
Development personnel are constantly fighting fires.
The backlog of improvements never seems to diminish, but rather increases.

Although the computer provides mechanical leverage for implementing systems, it has also fostered a tool-oriented approach to systems development. Instead of standing back and looking at our systems from an engineering/manufacturing perspective, it is seemingly easier and less painful to purchase a tool to solve a problem. This is like taking a pill when surgery is really required. What is needed is fewer tools and more management. If we built bridges the same way we build systems in this country, this would be a nation run by ferryboats.

The impact of the computer on the systems industry was so great that it elevated the stature of programmers and drove systems people to near extinction. Fortunately, the industry has discovered that there is more to systems than just programming and, as a result, is in the process of rediscovering basic systems theory. Some of the ideas being put forth are truly imaginative, others are nothing more than extensions of programming theory, and others are just plain humbug. In other words, the systems world is still going through growing pains, much like an adolescent who questions things and learns to experiment.

I have been very fortunate to see a lot of this history first hand. I have observed changes not just in terms of systems and computers, but also how the trade press has evolved and the profession in general. It has been an interesting ride.

Throughout all of this, there have been some very intelligent people who have impacted the industry, as well as quite a few charlatans, but only a handful of true geniuses, one of whom was Robert W. Bemer, who passed away just a couple of years ago. Bob was the father of the ASCII code, without which we wouldn’t have the computers of today, the Internet, the billions of dollars owned by Bill Gates, or this document.

Originally published: 03/14/2006

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2014 by Tim Bryce. All rights reserved.

NEXT UP:  THANK GOD FOR FOX NEWS – Love them or hate them, we need Fox.

LAST TIME:  50 YEARS OF THE BRITISH INVASION  – Beatlemania started it all on this date in 1964.

Listen to Tim on WJTN-AM (News Talk 1240) “The Town Square” with host John Siggins (Mon, Wed, Fri, 12:30-3:00pm Eastern), and KIT-AM 1280 in Yakima, Washington “The Morning News” with hosts Dave Ettl & Lance Tormey (weekdays, 6:00-9:00am Pacific). Or tune-in to Tim’s channel on YouTube.

Posted in Computers, Systems, Technology | Tagged: , , , , | 1 Comment »

IT’S ALL ABOUT TRANSACTIONS

Posted by Tim Bryce on October 21, 2013

BRYCE ON SYSTEMS

– Everything we do in systems and software involves the processing of transactions.

(Click for AUDIO VERSION)
To use this segment in a Radio broadcast or Podcast, send TIM a request.

Every now and then when I write about the systems field, I’m sure a lot of my general readers yawn. My thinking is, if I can educate the general public, they will be less likely to be duped by the programmers running corporate America today. As such, it is important for me to illustrate that most of what goes on in the systems and software world is really not as complicated as people make it out to be.

To illustrate, most of what we do in business is process transactions representing some sort of action or event, such as a purchase, a return, a back-order, a debit or a credit. On the highways, transactions include counting the automobiles passing a given point, tracking traffic signals, recording moving violations, and paying tolls. Transactions are used to record new employees or members of a nonprofit, or to make changes to their profiles. Requests to produce reports or obtain files also represent transactions. Commands such as “New,” “Add,” “Delete,” “Print,” “Download,” “Open,” “Save,” and “Search” are common transactions familiar to anyone who has used a computer. The point is, everything is based on some form of transaction.

My programmer friends who write computer games believe this is nonsense. Oh really? How do you keep score in the game? By tracking every right and wrong decision the player makes during the time allotted. Hmm, sounds like transactions are being recorded to me. Even Facebook and the other social networking programs keep track of the number of postings you make, not to mention the cookies placed on your computer to track your activities. A program without any form of transaction serves no useful business purpose.

Transactions can be processed either one at a time (“interactive”) or in groups (“batch”). The challenge is processing the volume of transactions within an acceptable amount of time, and this determines the physical constraints of the equipment to be used. “Batch” processing has the advantage of handling high volumes of transactions at a relatively low cost in time per transaction. “Interactive” processing has the advantage of turning individual transactions around quickly.
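
To make the distinction concrete, here is a minimal sketch in Python; the account names, amounts, and function names are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    """One business event: a purchase, return, debit, credit, etc."""
    account: str
    amount: float

def post(txn, balances):
    """Apply a single transaction to the master records."""
    balances[txn.account] = balances.get(txn.account, 0.0) + txn.amount

def process_interactive(txn, balances):
    # "Interactive": post each transaction the moment it arrives,
    # so the individual transaction turns around quickly.
    post(txn, balances)

def process_batch(txns, balances):
    # "Batch": accumulate transactions, then post them in one run,
    # trading immediacy for a low cost per transaction.
    for txn in txns:
        post(txn, balances)

balances = {}
process_interactive(Transaction("A-100", 25.00), balances)
process_batch([Transaction("A-100", -5.00), Transaction("B-200", 99.95)], balances)
print(balances)  # {'A-100': 20.0, 'B-200': 99.95}
```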

Just remember, all processing involves some form of transaction. There, that wasn’t too complicated, was it?

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2013 by Tim Bryce. All rights reserved.

NEXT UP:  THE MOST STRESSFUL PLACE TO LIVE? – Is the Tampa Bay area as bad as it is being labeled?

LAST TIME:  “FEEL GOOD” TYPES – You know the type: They walk away clueless; happy, but clueless.

Listen to Tim on WJTN-AM (News Talk 1240) “The Town Square” with host John Siggins (Mon, Wed, Fri, 12:30-3:00pm Eastern), KGAB-AM 650 “The Morning Zone” with host Dave Chaffin (weekdays, 6:00-10:00am Mountain), and KIT-AM 1280 in Yakima, Washington “The Morning News” with hosts Lance Tormey & Brian Teegarden (weekdays, 6:00-9:00am Pacific). Or tune-in to Tim’s channel on YouTube.

Posted in Computers, Software, Systems | Tagged: , , , , | 3 Comments »

WHY IS MY PERSONAL COMPUTER SLOWING DOWN?

Posted by Tim Bryce on September 30, 2013

BRYCE ON TECHNOLOGY

– Some simple tips to speed up your machine.

(Click for AUDIO VERSION)
To use this segment in a Radio broadcast or Podcast, send TIM a request.

This narrative is primarily aimed at PC novices: people who have a rudimentary knowledge of how the computer runs and are frustrated as to why it seems to slow down for them. You know the type; people who are prone to cursing at their computer. They are also easy prey for specialists who want to tune their computers at exorbitant rates. First, understand this: the average life of a computer in business is one year. Home computers, though, can last a little longer depending on treatment, typically three to five years at most. This is a prime example of “Parkinson’s Law” as applied to computer technology.

The biggest problem, though, is you, the user, who is putting a lot of junk on your computer, both knowingly and unwittingly. If you are downloading software to your computer, either through the Internet or by CD, you are introducing several new files to your computer. If you are accessing the Internet with a web browser, such as MS Internet Explorer, Google Chrome, or Firefox, you are downloading files to your computer. And if you read e-mails from family, friends, or strangers, you are downloading files, all of which are scattered around on the hard drive of your computer. Sometimes vendors secretly place “cookies” on your computer, which are files usually used for devilish purposes, such as monitoring what you are looking at on your computer or some other marketing trick. Even worse, spam and bugs may be introduced which are intended to either drive you crazy or hack into your private identity. This is why I’m a big believer in installing security software on the computer as a precursor to doing anything else. If you do not have such software, you are leaving yourself open to attack and theft.

One major rule you should live by: If you do not know the person sending you something on the Internet, DO NOT OPEN IT! If it seems too good to be true, it is! In all likelihood, it is some sort of spam aimed at disrupting your life, stealing your identity or e-mail address book, or setting you up to deposit money into a bank in Nigeria. Just let it go and delete the e-mail and its attachments.

Files are typically scattered around your computer and, despite the speed of your machine, may take time to assemble and load for your use. From time to time it pays to clean this up and, fortunately, there are some basic tools under MS Windows to help you in this regard:

1. Disk Cleanup
Go to Start -> “All Programs” -> “Accessories” -> “System Tools” -> “Disk Cleanup”
This will clean up several files for you and free up some space. You should run this utility periodically.

Another program, “Disk Defragmenter”, is also available:
Go to Start -> “All Programs” -> “Accessories” -> “System Tools” -> “Disk Defragmenter”
However, you will rarely have to use it. Only turn to this utility if you are truly stuck. It will try to correct damaged files as well as reorganize the hard drive for more efficient use. It typically runs for a long period of time, so I suggest you run it overnight.

2. Downloads
Go to Start -> “Documents” -> “Downloads” (under “Favorites”)
This will list the files you have downloaded to your machine, either intentionally or not. You may delete the files here, but be careful: this is a favorite hideout for spam files.

3. Security software – If you’ve got security software, such as Norton or McAfee, such tools usually have features which allow you to scan your hard drive for bad files and eliminate them. It may take a while to perform, but it is worth it.
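
Incidentally, if you are comfortable with a command prompt, the utilities from step 1 can also be scripted. A minimal sketch, assuming a Windows machine where the Disk Cleanup options were first saved with “cleanmgr /sageset:1” from an administrator prompt:

```python
import subprocess

# Replay the Disk Cleanup options previously saved with "cleanmgr /sageset:1".
subprocess.run(["cleanmgr", "/sagerun:1"], check=False)

# Defragment drive C:; /U prints progress. Like the GUI version, this can
# run for a long time, so schedule it overnight.
subprocess.run(["defrag", "C:", "/U"], check=False)
```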

You will inevitably hear the expression “cache,” which refers to an area on your computer where recently used files are kept for quick reuse. This is particularly useful for web browsers: every web page you access with a browser is stored on your computer, either in the cache or elsewhere. Over time, these files build up and become cumbersome to process. Consequently, it is necessary to clean out the files using your web browser (a toy sketch of what a cache actually does follows the steps below):

1. MS Internet Explorer (v10.0.9 plus earlier versions)
Select “Tools” from the action bar -> select “Temporary Internet files and website files” (and anything else you think pertinent).

2. Firefox (v23.0.1)
History -> Clear Recent History (particularly select, “Browsing & Download History,” “Cookies,” and “Cache.”)
For earlier versions of Firefox, you may find it under “Tools.”

3. Google Chrome (v28.0 and earlier versions)
Go to the Settings (the three little horizontal bars to the right of the web address line).
Select “Settings” -> “Show advanced settings” (at the bottom of the screen) -> Under “Privacy” select “Clear browsing data” -> On the panel, select “Clear browsing history,” and “Empty the cache” (and anything else you think pertinent).
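
As promised, here is a toy sketch of what a cache does: it keeps recently used pages close at hand and evicts the oldest when it fills up. This is purely illustrative (a real browser cache lives on disk and is far more elaborate); the class and its three-page capacity are invented:

```python
from collections import OrderedDict

class PageCache:
    """A toy 'least recently used' page cache."""
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.pages = OrderedDict()

    def get(self, url):
        if url in self.pages:
            self.pages.move_to_end(url)  # mark as recently used
            return self.pages[url]       # a "hit": served from the cache
        return None                      # a "miss": fetch from the network

    def put(self, url, body):
        self.pages[url] = body
        self.pages.move_to_end(url)
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)  # evict the oldest page

cache = PageCache()
cache.put("http://example.com", "<html>...</html>")
print(cache.get("http://example.com") is not None)  # True: no refetch needed
```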

BACKUP

Regardless of your proficiency with the computer, I encourage you to back up important files in the event your computer is damaged or crashes. I cannot stress this enough. There are ample software packages for this purpose, as well as services offered over the Internet; these basically take a copy of your files and move them to another physical computer on the Internet. You can also do this yourself using an external hard drive or even a flash drive.

Most people at home have just a handful of files requiring backup:

* Picture files (JPG).
* Letters and other documents.
* Financial software.
* E-mail address books.
* E-mail messages.
* Web browser bookmarks or “Favorites.”

For most people this can be accommodated by a simple flash drive. However, if you also have audio and video files, you may need something bigger. Whatever you select, I encourage you to have a game plan in place. I have seen far too many people lose files on their computers over the years. Even if you did nothing but copy and paste key folders and files to a flash drive (using your “Computer” from the Start menu), you’ll be way ahead.
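
Even the copy-and-paste approach can be scripted. A minimal sketch, assuming the flash drive mounts as E: and you are running Python 3.8 or later; adjust the folder list to match your own machine:

```python
import shutil
from pathlib import Path

# Folders most home users care about; adjust to suit.
SOURCES = [Path.home() / "Documents", Path.home() / "Pictures"]
DEST = Path("E:/Backup")  # the flash drive (assumed drive letter)

for folder in SOURCES:
    target = DEST / folder.name
    # dirs_exist_ok=True refreshes an earlier backup in place (Python 3.8+).
    shutil.copytree(folder, target, dirs_exist_ok=True)
    print(f"Copied {folder} -> {target}")
```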

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2013 by Tim Bryce. All rights reserved.

NEXT UP:  LET’S GET REAL ABOUT BIGOTRY – Bigotry exists and it isn’t going away any time soon.

LAST TIME:  GET OUT OF JAIL, FREE – Does anybody go to jail anymore?

Listen to Tim on WJTN-AM (News Talk 1240) “The Town Square” with host John Siggins (Mon, Wed, Fri, 12:30-3:00pm Eastern), KGAB-AM 650 “The Morning Zone” with host Dave Chaffin (weekdays, 6:00-10:00am Mountain), and KIT-AM 1280 in Yakima, Washington “The Morning News” with hosts Lance Tormey & Brian Teegarden (weekdays, 6:00-9:00am Pacific). Or tune-in to Tim’s channel on YouTube.

Posted in Computers, Technology | Tagged: , , , , | 5 Comments »

WHO KILLED THE PC?

Posted by Tim Bryce on September 6, 2013

BRYCE ON TECHNOLOGY

– It certainly wasn’t me.

(Click for AUDIO VERSION)
To use this segment in a Radio broadcast or Podcast, send TIM a request.

Nobody, actually. However, I’m getting tired of hearing about the PC’s demise every few years. I think such statements are designed to sell magazines rather than having any validity. I read the latest version of the death scenario in a computer trade rag, and I suspect it was written by another prepubescent with little experience in the field who swallows everything the vendors tell him. In this latest version, the PC’s demise is attributed to the advancement of tablets and smart phones. I’m sure such devices have had an impact on traditional laptops, but I cannot imagine them having a significant impact on traditional desktops.

I like the “look and feel” of my desktops, not just the bigger screens, but the mouse and full keyboard. I’ve never been able to acclimate to small flat-screen keyboards, particularly when writing voluminous documents. I can probably type 140 words a minute on a normal keyboard, but I feel tremendously restrained when trying to type on a tablet or smart phone. Not surprisingly, I think of desktop computers as “industrial strength” as opposed to the smaller devices, which are useful for smaller and less important tasks. There is no doubt we are a mobile society, but if you need something of substance done, you need a desktop computer. This is why I believe the announcement of the death of the PC is a bit premature. Consider this: if the PC were truly dead, the business world would be forced to shut down, as just about every network depends on it, as does small business.

Over the years I have also heard of the demise of the web browser (e.g., Internet Explorer, Firefox, Chrome, and Safari), but somehow they show no sign of abating. Then there is the supposed death of certain programming languages, particularly COBOL, which was primarily used on mainframe computers. Interestingly, it is now over fifty years old but still keeps on truckin’, as do other programming languages and data base management systems. If you are a COBOL programmer, you’ve got a job for life, as nobody will dare fire you for fear their legacy systems will somehow implode without you.

True, our technology changes rapidly, but I don’t think anything completely dies in the computer industry. We may not use punch cards much anymore, but I’ll bet there is an ample supply of card readers still out there “just in case,” along with archaic tape drives and other hardware/software devices.

No, the PC isn’t going to die any time soon. There are simply too many people imbued with the technology. I am also sure this will not be the last time we hear of its demise, particularly as other vendors want to promote an alternative technology. We should always be a little skeptical when we hear, “The sky is falling.”

Next time you hear the claim the PC is dead, simply mutter “Nonsense” under your breath and trash whatever you are reading.

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M&JB Investment Company (M&JB) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
timbryce.com

Like the article? TELL A FRIEND.

Copyright © 2013 by Tim Bryce. All rights reserved.

NEXT UP:  “HOMO SAPIEN ASS****” – It’s a matter of acting on perceptions, not reality.

LAST TIME:  FIVE EASY LESSONS FOR SYSTEM DESIGN – It need not be as complicated as people make it.

Listen to Tim on WJTN-AM (News Talk 1240) “The Town Square” with host John Siggins (Mon, Wed, Fri, 12:30-3:00pm Eastern), KGAB-AM 650 “The Morning Zone” with host Dave Chaffin (weekdays, 6:00-10:00am Mountain), and KIT-AM 1280 in Yakima, Washington “The Morning News” with hosts Lance Tormey & Brian Teegarden (weekdays, 6:00-9:00am Pacific). Or tune-in to Tim’s channel on YouTube.

Posted in Computers, Technology | Tagged: , , , , | 5 Comments »

THE 4 STAGES OF BUSINESS/TECHNOLOGY GROWTH

Posted by Tim Bryce on March 4, 2012

(Click for AUDIO VERSION)

Back in the 1980’s I wrote a paper entitled “The 4 Stages of IRM Growth,” which described the maturation process by which companies implement information resources (Information Resource Management). The paper was published not only in the trade press but was included in our book, “The IRM Revolution – Blueprint for the 21st Century” (MBA Press), which was well received, not only in this country but in Japan as well. I recently stumbled across the paper again and, in reading it, I found it to be as applicable to today’s world as when I first penned it 25 years ago.

The paper describes the characteristics of a company as it implements information resources in four stages: Birth, Childhood, Adolescence, and Adulthood.

BIRTH

The day a company goes into business is the day its information systems are born. When a new company or organization is established, there are some very primal information requirements to accommodate the operation of the enterprise: for example, basic bookkeeping (billing, payroll, government reporting, etc.), minutes of meetings, recording of policy decisions, schedules, and correspondence.

To implement these basic administrative requirements, simple office equipment is typically required, such as typewriters, calculators, photocopiers, telephones, fax machines, etc.

An Office Manager with a clerical staff (e.g., secretaries, bookkeepers) normally implements these processes and operates the equipment. During this stage, their concern is for implementing basic manual procedures with an eye for work simplification to minimize overhead.

As the business expands and becomes more complicated, whether from an increase in employees and/or business, there is a growing demand for more information which leads to the next stage of growth…

CHILDHOOD

This stage is entered into either by an emerging company or an established firm that is pressured to investigate the potential of new technology, namely the computer, to give leverage to their business needs. This is a stage which most of the “FORTUNE 500” companies and major government institutions went through in the 1950’s, 60’s and 70’s.

In the childhood stage, the intent is to investigate the potential of the computer. This is an age of experimentation where a highly complicated and technical device is introduced to a company. This new technology, of course, requires a technically oriented individual to operate it: someone who is more in tune with the equipment than with the problems and objectives of the business.

The computer is typically centralized in one location until someone can determine an appropriate way to apply it to the business.

This stage results in the executive’s “black box” image of the computer. The executive doesn’t fully understand its capabilities and looks upon it suspiciously as a necessary evil. As a consequence, executives divorce themselves from the machine and appoint an “IT Manager” who is given free rein over the new technology. Like the staff that supports him, the IT Manager is technically inclined (probably just one step ahead of a programmer).

The “IT Department” tackles simple problems aimed at automating some of the basic administrative routines of the company. There is no great pressure to satisfy business problems, only a “see what you can do” type of attitude. As a result, the IT staff takes an ad hoc, “quick and dirty” programming approach to problem solving. This philosophy sows the seeds for problems in the years ahead: applications are not integrated, data is not shared (data redundancy is commonplace), documentation is nonexistent, and applications are not easy to maintain or modify. As a result, they are constantly being discarded and rewritten, further compounding the problem.

One of the most significant aspects of this stage is that it fosters the “tool oriented approach” for solving problems. The attitude of the staff is that the only legitimate problems worth solving are those that can be addressed by the computer. All others are immaterial. This is a frame of mind that will take considerable time to overcome. The indifferent attitude of the IT Department irritates and alienates end users who have increasing demands for information.

Impatient for results, management begins to apply pressure on the IT Manager for more applications to satisfy user demands. This leads to the next stage …

ADOLESCENCE

This is the age of awakening for most companies, an era when the IT Department begins to manage itself in order to accommodate growing business demands. The IT Manager is supplanted by an IT “Director,” someone who is a little more adept at management politics.

In this stage, the IT Director implements rudimentary management controls, particularly in the areas of project management and documentation. Using the “tool oriented approach” to improve staff productivity, the IT Director implements several software tools and techniques, such as: Data Base Management Systems (DBMS), Program Generators, Report Writers, Fourth Generation Languages (4GL), Computer Aided Software Engineering (CASE), etc.

Dazzled by sophisticated software and in fear of “falling behind” in the technology race, the IT Director authorizes the purchase of tools that implement esoteric (some prefer to call it “Voodoo”) management principles.

Unfortunately, the IT Director is seduced and abandoned by the technology; the results are still the same: Applications do not satisfy user needs, applications are not integrated, data redundancy is still pervasive, applications are still difficult to modify and maintain, and the staff remains a free-spirited group of technicians.

The “tool oriented approach” is very costly to the company, yet the results do not change. The IT Director is still supported by a technical staff that believes the “real work” is in the production of software, where their programming skills excel. The “Analyst/Programmer” is really nothing more than a senior programmer.

Superficial standards and pseudo-scientific management techniques are applied to the development process. An application project typically consists of the classical approach for developing systems: A primitive Feasibility Study, General Design (sometimes referred to as “External Design”), Detail Design (“Internal Design”), Programming (usually following a Structured Programming Guru’s technique), Testing, Installation, and Review. In this situation, programming remains 85% of the entire project. This approach is usually well packaged in voluminous standards manuals (which no one but the Auditors read).

The computer is decentralized with mainframes, minis and micros being distributed throughout the company.

The end User, frustrated by the lack of support from the IT Department, turns to the Personal Computer (PC) for help. Unfortunately, the User is no more adept at using the computer to solve his business needs than the IT people are, and the problems are compounded even further (particularly in the area of redundant data).

Despite the substantial investment in computer hardware and software thus far, executive management finally recognizes that conditions are intolerable and that the company is not getting a satisfactory return on investment. This becomes the catalyst for change. Without it, the company stagnates and the situation worsens. Adolescence must eventually give way to …

ADULTHOOD

This stage represents a radical departure from the past mode of operation. Very few companies, if any, have reached this stage of growth yet. It represents a mature environment where the systems staff is in tune with the mission of the company, and information is viewed as a corporate asset used for strategic purposes. This is the age of Information Resource Management (IRM). This philosophy gives rise to the Chief Information Officer (CIO), a true and legal officer of the company, not just a job title. Such an officer reports, at least, on the same level as the Chief Financial Officer (CFO).

No longer is the “tool oriented approach” pervasive in the company. It was tried, and it failed. The latest “state of the art” technology is a worthless status symbol if it doesn’t contribute to the profitability of the company.

Now, the CIO turns to tried and proven approaches to management. Information Systems design is no longer viewed as an art, but a science. The CIO organizes the systems development environment into an engineering/manufacturing company, complete with Assembly Lines, Production Control and Materials Management. As a result, the systems staff is transformed from free spirited programming “hackers” to a group of disciplined and quality conscious business professionals. In some respects, the staff will resemble the “Systems and Procedures” staff of yesteryear who had a business orientation.

The computer is viewed as just another piece of office equipment, no more remarkable than the rest. Users and management no longer fear technology because the CIO implements it effectively into the business. In the adult stage, the emphasis is on complete and integrated information systems, not just software. Programming is less than 15% of the entire development process, with the bulk of the work being expended on business analysis. Data is managed as a resource and redundancy is eliminated. All of the problems experienced earlier disappear.

As enticing as adulthood may sound, very few companies have the management skill or fortitude to make it happen, particularly in the United States. Most companies don’t even understand the problem. Adulthood represents a substantial and long-term corporate commitment, not just departmental commitment, which most American companies strongly resist. Instead, they are content with short-term “quick and dirty” solutions. On the other hand, Asian companies, who are much more far-sighted, have a greater chance for success and are rapidly moving into the adulthood stage. This will make them increasingly more competitive in the years ahead.

THE FOUR STAGES OF MATURITY

APPLICATIONS
* Birth: Basic bookkeeping
* Childhood: Programming of basic administrative routines
* Adolescence: Major systems
* Adulthood: Information as an asset & strategic weapon

EQUIPMENT
* Birth: Basic office equipment
* Childhood: Centralized computer
* Adolescence: Decentralized computing
* Adulthood: Computers blend in with office equipment

PERSONNEL
* Birth: Office Manager & clerical staff
* Childhood: IT Manager & technical staff
* Adolescence: IT Director & programmer/analysts
* Adulthood: CIO & business oriented staff

ENVIRONMENT
* Birth: Concern for manual processing; work simplification
* Childhood: Experimentation (“See what you can do”); beginning of the “tool oriented approach”
* Adolescence: Awakening; rudimentary management techniques & tools applied
* Adulthood: Age of IRM; strong management; science vs. art; discipline, organization & quality consciousness

CONCLUSION

Over the last ten years alone, computer technology has changed radically, job titles and terminology have changed, and salaries have risen sharply, but little else has changed. The information problems of today are no different than 10, 20 or 30 years ago. Despite today’s technology, companies still experience:

* Project cost overruns and slipped schedules.

* Poor communications and relations with the User community.

* Redundant data and lack of application integration.

* Applications are difficult to modify and maintain.

* Lack of adequate documentation.

* Design inconsistencies.

* Applications still do not satisfy User needs.

* Hardware/Software dependencies.

* Employee dependencies to maintain systems.

The tools and characters have changed, but the tune remains the same. Regardless of the titles and technology used, most companies in North America are stuck in either the “Childhood” or “Adolescence” stage of growth. Indicative of this are the journals, trade groups, universities, and trade shows that still promote the “tool oriented approach” as opposed to promoting management. Systems development is still viewed by many people as an art, not a science. In reality, it is a science: it rests on established, proven concepts and can be taught as such.

“No amount of elegant technology will solve our problems, only strong management will.”
– Bryce’s Law

EPILOGUE

Again, it has been 25 years since I penned this paper but I do not see anything in the corporate world to cause me to change this model. Whereas American businesses tend to be stuck in the Adolescent stage, many companies in Japan and Europe have moved on to the Adulthood stage. As long as our viewpoints remain focused on technology and not the big picture of total systems, America will continue to lose its competitive edge.

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M. Bryce & Associates (MBA) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
http://www.phmainstreet.com/timbryce.htm

Like the article? TELL A FRIEND.

Copyright © 2012 by Tim Bryce. All rights reserved.

Posted in Business, Computers, Systems, Technology | Tagged: , , , , | 1 Comment »

IS SOFTWARE HARD?

Posted by Tim Bryce on February 26, 2012

(Click for AUDIO VERSION)

For something that is supposed to be “soft”, computer software exhibits some pretty “hard” characteristics. The original premise behind the COBOL programming language was to devise a language that could be easily ported to several computers. This never truly happened due to computer manufacturers who tweaked the language to suit their particular needs. What ran on an IBM machine, for example, didn’t necessarily run the same on Honeywell, UNIVAC, or the rest of the BUNCH. Consequently, software developers had to maintain different versions of source code to suit the particular needs of the various computer compilers. This plagued all third generation languages until Sun introduced JAVA in the 1990’s. The JAVA premise that a programmer should “write once, run everywhere” was the right idea and the language began to gain momentum, until it ran into Microsoft who didn’t want to turn the operating system into an inconsequential afterthought. JAVA lives on, but not to the extent it should have, and developers are back to managing separate versions of source code.

The point is, software does in fact exhibit some very “hard” characteristics: it is married to the host computer configuration, which doesn’t make it very portable. As mentioned, this creates headaches, particularly for commercial software vendors, in terms of maintaining consistency across the different versions of our products.

What to do?

Back in the 1970’s and 1980’s, our company was faced with the dilemma of managing a single product on over a dozen different computer platforms. We quickly realized we would go stark raving mad managing multiple versions of source code and concluded we had better come up with a solution pretty quickly. Because of our experience in converting software, we became well versed in the nuances of the various compilers and devised a Repository (we called it a “filter program” at the time) which maintained the rules of the various compilers. We were also very disciplined in writing code to specific standards and embedded certain switches in the base source code. When we were ready to produce a new release of our product, we would feed the base code into our “filter program,” which would then create the different versions of the source code ready for compilation. This saved us an incredible amount of time and brought consistency to all versions of the product. In other words, our programming staff worked with only one set of programming code (not multiple variations). The “filter program” then analyzed it and created the necessary permutation for a targeted platform. As compilers changed, we would update the “filter program” accordingly.
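
The article does not show the switch syntax we embedded, so the “*IF”/“*ENDIF” directives below are invented for illustration, but a minimal sketch of such a filter program might look like this:

```python
def filter_source(base_lines, target):
    """Emit the source-code variant for one target platform."""
    out, keep = [], True
    for line in base_lines:
        if line.startswith("*IF "):
            keep = (line.split()[1] == target)  # open a platform-specific block
        elif line.startswith("*ENDIF"):
            keep = True                         # close the platform block
        elif keep:
            out.append(line)                    # common or matching code
    return out

# One base source file feeds every platform variant.
base = [
    "MOVE A TO B.",
    "*IF IBM_MVS",
    "CALL 'MVSIO' USING REC.",
    "*ENDIF",
    "*IF UNIVAC_EXEC",
    "CALL 'EXECIO' USING REC.",
    "*ENDIF",
]
for platform in ("IBM_MVS", "UNIVAC_EXEC", "HP_MPE"):
    print(platform, filter_source(base, platform))
```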

We also learned to maintain print maps, screen panels, messages and help text separate from the source code, which greatly enhanced our ability to create a new version of the product to suit a foreign language and culture; see “Creating Universal Systems.”

Let us take it a step further: for years we have touted that there are logical and physical dimensions to Information Systems. We look upon Systems and Sub-Systems (business processes) as logical constructs, and Procedures and Programs as physical constructs. Further, data components such as inputs, outputs, files, records and data elements can be specified logically and implemented physically in many different ways. Let me give you an example: back in the 1980’s, one of our customers (a large Fortune 500 electronics conglomerate) bought into our logical/physical concept and decided to put it to the test. Working from their headquarters, they designed a complete Payroll System which they wanted to implement as the corporate standard across all of their divisions and subsidiaries. They completed the system with a recommended programming solution they wrote themselves (no packages were used), which I believe was an IBM MVS solution using COBOL. However, they recognized this implementation wouldn’t work across the board in the company. Consequently, they gave the system specifications to all of their divisions, who would then program it themselves in-house. The project turned out to be a major success, and the company ended up with multiple implementations of the same system under IBM MVS, VM, Honeywell GCOS, UNIVAC Exec, HP MPE, DEC VAX/VMS, and Prime, all working harmoniously together. Other customers experienced similar successes, particularly in Japan.
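
The payroll rule below is hypothetical, but it illustrates the separation: the logical business rule is written once, and each “physical” implementation merely feeds it data from a different source.

```python
def gross_pay(hours, rate):
    """Logical design: one business rule, defined once, platform-agnostic."""
    overtime = max(0.0, hours - 40.0)
    return (hours - overtime) * rate + overtime * rate * 1.5

# Physical implementation #1: in-memory records (say, a test harness).
for name, hours, rate in [("Ito", 43.0, 20.0), ("Smith", 38.0, 22.5)]:
    print(name, gross_pay(hours, rate))

# Physical implementation #2 might read the same two fields from a VSAM
# file, a relational table, or a CSV; the rule itself never changes.
```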

All of this drives home the point that systems are logical in nature, and that programming is physical. If systems are designed properly, there is no reason they shouldn’t behave identically on whatever computer platform you come up with. Better yet, it allows us to easily migrate our systems from one configuration to another. Uniformity and consistency in execution; and portability to boot. Imagine that.

“Systems are logical, programming is physical”
– Bryce’s Law

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M. Bryce & Associates (MBA) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
http://www.phmainstreet.com/timbryce.htm

Like the article? TELL A FRIEND.

Copyright © 2012 by Tim Bryce. All rights reserved.

Posted in Computers, Software, Technology | Tagged: , , , , , | 3 Comments »

THE MYTH OF THE PAPERLESS WORLD

Posted by Tim Bryce on January 22, 2012

(Click for AUDIO VERSION)

For a society bent on becoming paperless, I find it rather amusing that sales at corporate giant International Paper Company actually increased from $21.9B in 2006 to $25.1B in 2010. It doesn’t sound like the demand for paper is diminishing, does it? In reality, printing is simply being shifted from one party to another. To illustrate, the many community and civic newsletters which used to clog our mail boxes have been replaced by PDF files which are commonly e-mailed or downloaded from web sites. People print them in part or in full as opposed to the publisher, thereby transferring publication costs to the consumer.

Financial institutions were quick to jump on the bandwagon. Most, if not all, banks in this country have abandoned printing and mailing monthly statements, thereby forcing the consumer to print them instead. Reluctantly, they’ll still mail you statements if you must have them, but they desperately want to get out of the printing business. The government has followed suit: whereas taxpayers used to get their IRS booklets and forms through the mail, now the consumer is expected to download and print them. No wonder the United States Postal Service is going broke; there is nothing to mail anymore.

Our company has maintained a post office box for a number of years now. In the past we could count on receiving at least 100 pounds of junk mail annually, but this has dropped off substantially. Now we barely get a post card. Instead, our e-mail queues are overloaded with spam despite the blockers we have in place. If we find an ad for something we are interested in, we’ll dutifully print it (not the retailer).

Perhaps the two biggest areas of e-paper are travel reservations and retail sales. For example, airline tickets in the past were printed and mailed to you. Now, the consumer is expected to print them instead. The same amount of paper is produced, only you are paying for it. Retail sales are no different: the consumer must print a receipt if he is so inclined, not the vendor.

When you walk into an office supply store, one of the biggest items commanding your attention is the skids of stock paper available to you. Somebody must be buying all this paper, and most likely it is the consumer, as opposed to businesses who have it delivered directly to their offices. Aside from paper, sales of printers, cartridges, and paper shredders are also doing well, indicating a robust print industry is still alive and well.

This transfer of printing costs directly impacts your cost of living. Sure, paper is relatively cheap, but the cost of printers and ink cartridges adds up over a year’s time. Going paperless may reduce the costs of the organization producing the documents, but it certainly adds to the costs of the consumer; such is the price of progress, I guess. No wonder sales at International Paper are increasing unabated. Maybe the folks at Dunder Mifflin are on to something after all (from NBC’s “The Office”).

So, is the world really going paperless? Ask International Paper.

Keep the Faith!

Note: All trademarks both marked and unmarked belong to their respective companies.

Tim Bryce is a writer and the Managing Director of M. Bryce & Associates (MBA) of Palm Harbor, Florida and has over 30 years of experience in the management consulting field. He can be reached at timb001@phmainstreet.com

For Tim’s columns, see:
http://www.phmainstreet.com/timbryce.htm

Like the article? TELL A FRIEND.

Copyright © 2012 by Tim Bryce. All rights reserved.

Posted in Business, Computers, Life | Tagged: , , , , , | 2 Comments »

 