White Paper

ASP Data Entry: A White Paper Primer (Cont'd)


Appendix A: ASPs


I. What Is An ASP?


An Application Service Provider, or ASP, hosts software applications much the same way an Internet Service Provider (ISP) hosts Web sites — the software and data reside on a server and you access them over the Internet.

The ASP model offers the potential to revolutionize the way companies operate. However, the technology is simply an evolution of the networking model. Instead of having the PC on your desk networked to other PCs in your office, and your application on the server down the hall, your PC is now part of a much larger network — the Internet. Your application resides on a server that is connected to the Internet. When you log onto the Internet, you can access the application.

In some ways, ASPs are also similar to the time-sharing model of mainframe access that was popular in the 1960s and 1970s. You reach your software and data remotely and you are sharing computing power with others. However, with ASPs you can access your data anywhere there is an Internet connection — you do not need to use a particular terminal.


II. Three Common Misconceptions About ASPs

ASPs offer a new business model, one that has only existed for the past year or two. As a result, many people have misconceptions about ASPs.

Here are the three most common:

A. "The Internet Is Slow."
This is a particularly common belief among people who are accustomed to accessing the Internet from a dial-up connection at home. Actually, using an ASP can be faster than running software over your network or on your PC. There are two reasons for this.

The first is the connection itself: we recommend a high-speed connection. DSL is preferable and usually very cost-effective where it is available. The faster the connection, of course, the faster the processing will be.

But even with relatively slow dial-up connections, such as 56K, using an ASP can be faster. This is because most ASPs use Citrix MetaFrame to speed processing. With Citrix, all processing occurs on the server, not on your client PC. All that passes between your client and the server are your mouse clicks and keystrokes, plus screen updates from the server. Citrix's ICA technology sends only the changes to the screen; it does not refresh the entire screen the way your browser does when you surf the net. Since the volume of data passing along your Internet connection is significantly reduced, your effective speed is much faster.
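To illustrate the idea, the sketch below (in Python, and purely illustrative; it is not Citrix's actual ICA protocol) divides the screen into tiles and transmits only the tiles that changed since the last update:

    # Illustrative sketch only -- not the actual ICA protocol.
    # Frames are lists of pixel rows; only changed tiles are sent.

    TILE = 32  # tile edge length in pixels (arbitrary choice)

    def changed_tiles(prev_frame, curr_frame):
        """Yield (x, y, pixels) for each tile that differs."""
        height, width = len(curr_frame), len(curr_frame[0])
        for y in range(0, height, TILE):
            for x in range(0, width, TILE):
                prev = [row[x:x + TILE] for row in prev_frame[y:y + TILE]]
                curr = [row[x:x + TILE] for row in curr_frame[y:y + TILE]]
                if prev != curr:
                    yield (x, y, curr)  # send only this small region

Because a keystroke typically changes only a character or a field on screen, each update is a tiny fraction of a full-screen refresh.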

B. "The Internet Isn't Reliable."
While the Internet does have outages from time to time, no computer system is immune from going down. Office networks are frequently down, which means your application may not be available, and PCs lock up and have to be rebooted or repaired.

Billions of dollars have been poured into creating redundant networks over the past couple of years, so that even if there is an outage, your connection can be rerouted over other lines. This has added significantly to the reliability of the Internet.

C. "The Internet Isn't Safe."
Regular news stories about hackers and viruses have left most people with a healthy concern about security over the Internet. Most ASPs take extensive precautions to ensure the security of your software and data. Servers are typically located in secure data centers, with around-the-clock security and protection against fires and other natural disasters. Data transmissions are usually encrypted and protected by firewalls. When the ASP is hosting an e-mail program or other application that leaves it vulnerable to viruses, virus protection programs are run regularly and updated frequently. Backups are also run regularly and saved to off-site storage. In fact, many ASP customers find the protections ASPs employ are more rigorous than the ones they previously had in place.


III. Typical Benefits of ASPs

A. Geographic Flexibility
One of the primary benefits of using an ASP is that you can work anywhere you can connect to the Internet. Because your software and data are all accessed over the Internet, you are no longer limited by what files you brought with you on a floppy or what you can get your office to overnight you. No matter where you are, you can do the same work you could in the office. As a result, organizations whose staff travel extensively or often work outside the office are among the most drawn to ASPs.

A corollary is that a company is no longer limited to hiring those people who can come to the office. Telecommuting, on either a temporary or permanent basis, becomes easier to arrange. Part-time staff can work from home on a regular or as-needed basis. As a result, ASPs offer organizations unprecedented flexibility in staffing. This also saves money because the company no longer has to invest in infrastructure (like office space, furniture, etc.) to support maximum staff levels.

B. Relief To IT Staff
Everyone is aware that the demands on an organization's IT staff are extensive and ever-increasing. It is becoming harder and harder for companies to find qualified IT staff and retain them once hired. With an ASP, some of the IT staff's responsibilities are handed off, relieving some of the burdens on their time and freeing them for other projects. For example, ASPs typically are responsible for implementing and integrating new software, rolling out upgrades and bug fixes, maintaining the applications they host and providing support.

ASPs can also usually provide a much deeper level of expertise than a company's own IT staff. Because they are supporting multiple companies, they can justify having certified experts in a wide variety of areas. This, along with the benefit of using standard software and hardware configurations, results in faster problem resolution.

C. Cost
Most companies find that ASPs save money over traditional software implementations, particularly when the cost comparison is based on the Total Cost of Ownership (TCO) of an application: software, hardware, implementation, training, staff, etc.

A large area of savings is that with an ASP there is typically no large up-front expenditure for software, the hardware needed to run the application, and hiring staff to implement and manage the software.

Having one predictable monthly expense makes budgeting easier. Paying on a per-user basis, as is typical, allows a company to tie costs more closely to revenue. This is particularly useful for new businesses or companies just entering a new line of business. Organizations that have seasonal fluctuations in the number of users also benefit.

D. Hardware Savings
ASPs use the "thin client" model, with little processing done on the client side. As a result, even old PCs make very adequate workstations. It no longer is necessary to upgrade clients every two years to keep up with increasing demands of the software.

The requirement that all departments standardize on one type of equipment also disappears. If one department wants to use Macs, that is not a problem. PCs, Macs, laptops — all can run the same software over the Internet.

E. Focus On Core Competencies
Being able to get up and running quickly, with little required from the corporate IT department, means that the company can focus all its attention on the goals of a project and devote its effort to its core competencies. You are able to do what you do best and leave running the applications to the ASP.

F. Speed To Market
Since the software is implemented on one central server, with standard hardware and software and usually minimal customization, implementations are typically very fast. Applications that usually take several months to install can be available for use in weeks or days. When the IT department is backed up or a project simply has to ramp up quickly, an ASP is an ideal solution.


IV. Issues to Consider When Working With ASP-Hosted Software

A. Service Level Agreements (SLAs)
"Service level agreements" (SLAs) that promise a guaranteed level of availability are offered by virtually all ASPs. Typically, they promise that you will be able to connect to your application 95% of the time or more. However, most don't offer much recourse beyond a pro-rated refund if they are down more than the promised amount of time. Given that, it is more important to find out what the ASP has done to make sure they can deliver what they promise than to look at a specific percentage.

B. Integration/Customization
Most ASPs offer relatively little in the way of customization. It is important to find out what can be modified on the application you will be using and see if that meets your needs.

C. Support
Some ASPs provide support themselves, while others refer you to a third party. Your best support will usually come directly from the software vendor, which knows the software best.

D. Usability & Quality Of The Software
Since you will probably want only minimal training, and users may use the software only occasionally, it is particularly important to choose software that is easy to use, reliable and of very high quality.

E. Cost
It is important to clarify exactly what the quoted price covers and what additional costs there might be.

F. Control/Security
Security issues should be discussed with the ASP. In addition to protecting data from hackers, you need to make sure you can maintain the level of control you need over approved users of the application and data.

G. Partners
Virtually no ASPs are sole-source providers. Every company specializes in what it does best, whether that is developing software, running a data center or providing communications services. Finding out who your ASP's partners are will give you greater confidence in its ability to deliver the service you need.


Appendix B: Professional Data Entry Software

(Sections II and III by John R. Haley. Reprinted with permission from the TAWPI Certification Handbook.)


I. Why Are We Still Keying In Data?

Recognition technologies, such as Optical Character Recognition (OCR) and Intelligent Character Recognition (ICR), have the potential to achieve considerable savings over keying data. But the technology is not yet at the point where it works flawlessly. Despite constantly improving recognition engines, much information cannot be identified or is interpreted inaccurately. Corrections have to be made by keying in the data.

An additional issue for many businesses is that although one company may have data in electronic format, their software and systems may not be compatible with those of another business. Even if the systems are compatible, they may not trust each other or do business often enough to justify setting up electronic data exchanges. As a result, they have to regress to the lowest common denominator and transmit data on paper.


II. Why Is Data Accuracy Important?

Although the initial setup and implementation costs of a data capture project are not insignificant, operating costs typically dwarf them. Operating costs continue for the lifetime of the project and are often orders of magnitude higher than the acquisition and implementation costs.

The largest component of operating costs, and one of the most often underestimated elements of a data capture project, is labor. Labor costs account for 90-95% of the life-cycle cost of a data capture system.

Two issues are crucial to minimizing the labor cost of any data capture project: the speed with which data is keyed and the accuracy with which it is entered.

Speed is important because the faster each operator can key, the fewer operators are needed. With labor costs the largest ongoing expense for most data capture projects, even small improvements in keying speed can result in substantial savings. Even today, techniques for enabling high-speed keying are not widely known and are usually found only in specialized data entry systems.
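A back-of-the-envelope calculation illustrates the point. The daily volume and keying rates below are hypothetical, but the arithmetic holds at any scale:

    # Hypothetical figures, for illustration only.
    KEYSTROKES_PER_DAY = 4_000_000   # total daily keying volume
    HOURS_PER_SHIFT = 7              # productive hours per operator

    def operators_needed(keystrokes_per_hour):
        """Operators required to key the daily volume at a given rate."""
        per_operator = keystrokes_per_hour * HOURS_PER_SHIFT
        return KEYSTROKES_PER_DAY / per_operator

    print(operators_needed(10_000))  # about 57 operators
    print(operators_needed(11_000))  # about 52 operators

In this example, a 10% gain in keying speed eliminates roughly five full-time operators.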

The other issue is accuracy. Besides the labor cost of finding and repairing errors, there are the intangible costs and effects of inaccurate data on your business.

If your data is text that will be read but not processed further, errors may not be important. Readers can get the meaning from the context. However, most of the time your data will be used in other applications, and the old "garbage in, garbage out" rule applies.

It is impossible to overemphasize the importance of correct data. Data errors are often the most costly aspect of data capture, costing the organization even more than the original data entry. Unfortunately, many of these costs are intangibles or are not easily measured, yet they are very real.

When errors are detected early in the data capture process the repair cost is very little — only the time to re-key the correct character or field. Many errors can be found by having the computer check and validate data automatically as it is keyed. Table look-ups, spell checkers, and other programmatic tools offer comprehensive coverage at a very low cost.

Errors that are not detected until after the data has been processed are more expensive to correct. There may be an hour or more of clerical time involved in finding the document and entering the corrected transaction. It is even more costly when the bad data has been propagated throughout the system and has to be corrected in more than one place. The time lost in processing errors can also have a substantial cost. For example, shipments may be delayed and invoices may be sent out late.

The longer it takes to discover an error, the more harm it can cause. Erroneous part numbers may result in shipping the wrong product. Incorrect customer numbers lead to all sorts of unpleasant and expensive problems. Invalid quantities and amounts also cost the company far more than the cost of the data input. The result can be serious customer service problems.

So, an important goal of any data capture project must be to control costs by maximizing data entry speed and minimizing errors.


III. How Does Professional Data-Entry Software Increase Speed And Accuracy?

There are many factors that influence keying speed. There is no single overriding factor, but a whole series of little things that combine to allow the keyer to reach his or her maximum potential. Interestingly, studies have shown that the fastest keyers are also the most accurate. This means that these factors are important even if you do not expect blinding speed from the keyers.

A. Ergonomics
The basic strategy behind speeding data entry is to minimize hand and finger motion so operators can key as quickly as possible. But pure speed, especially if it comes in erratic bursts, is not enough. Rhythm is a very important factor in high-speed keying, and the best operators talk about the importance of maintaining a smooth rhythm while keying. Furthermore, fast keyers actually look ahead of where they are keying. There is a discernible time lag between the eyes and the fingers, which means that to maximize speed operators must be able to look ahead.

Here are some of the techniques that can be used to speed keying and help maintain rhythm:

Minimizing Usage Of The Mouse. The conventional wisdom is that a Graphical User Interface (GUI), such as Windows, is the easiest, fastest approach to entering data. However, methods that are best for programmers and knowledge workers are often counterproductive for production data entry. The "intuitive" feel of a GUI is great for non-repetitive tasks, which are not performed often enough for anyone to achieve proficiency. But when entering large amounts of similar data, it is essential that the operator's hands never leave the keyboard. Every time an operator takes her hand off the keyboard to use a mouse, she loses time and momentum. Professional data entry software is designed to let the operator accomplish everything she needs to without using a mouse.

Using The 029 Keyboard For Numeric Keying. The fastest keyers traditionally have used the "029 keypunch" keyboard layout, which places the numeric keys underneath the right hand. This allows numeric characters to be keyed at the maximum rate without moving the hands from the home keys. Good data entry systems can emulate this capability on a standard keyboard.
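Emulating the layout amounts to remapping right-hand keys to digits whenever the current field is numeric. The Python sketch below is illustrative only; real products differ in exactly which keys they assign:

    # Illustrative numeric-shift remapping (actual 029 emulations vary).
    NUMERIC_SHIFT = {
        'm': '0',
        'j': '1', 'k': '2', 'l': '3',
        'u': '4', 'i': '5', 'o': '6',
        # 7, 8 and 9 are already under the right hand on the top row
    }

    def translate(key, field_is_numeric):
        """Map a raw keystroke to the character actually entered."""
        if field_is_numeric:
            return NUMERIC_SHIFT.get(key, key)
        return key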

Using Function Keys. Production keyboard data entry takes advantage of the function keys to build in specific features and functions for data entry. For example, function keys can move the cursor one character or one field ahead (or behind), or to the last (or first) field. Function keys can also be used to replicate data that has already been entered, saving a considerable number of keystrokes, and to skip fields that are often empty. The alternative, removing the hand from the keyboard to manipulate a mouse to perform those functions, is not nearly as efficient.
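In software terms, this is a dispatch table mapping function keys to editing actions. A minimal sketch, with hypothetical bindings and a hypothetical editor object:

    # Hypothetical bindings; real products let sites configure these.
    ACTIONS = {
        'F2': 'cursor_to_previous_field',
        'F3': 'cursor_to_next_field',
        'F4': 'duplicate_field_from_previous_document',
        'F5': 'skip_to_next_required_field',
    }

    def handle_key(key, editor):
        """Dispatch a function key without the operator touching the mouse."""
        action = ACTIONS.get(key)
        if action:
            getattr(editor, action)()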

Completing Fields With The ENTER Key. In professional data entry systems, the ENTER key is used to complete a field. ENTER is a big key under the right hand, making it particularly easy to reach. Most people find it more intuitive to hit ENTER at the end of a field than to use the TAB key, which is how Windows applications typically signal the completion of a data field. Data entry systems instead reserve the TAB key for skipping ahead over data entry fields to a predetermined field.

Making Corrections. When errors occur and are caught programmatically, such as when letters are entered into a numeric field, the error message must lock the keyboard. The keyer should be required to acknowledge the error message and correct the error. Allowing keyers to ignore error messages is a common cause of inaccurate data.

B. Validation Features
Since the sooner an error is caught the less expensive it is to fix, every effort should be made to detect errors as early as possible. Data can be validated at three different points as it is entered. First, each character should be sieved to allow only acceptable characters (e.g., no letters in a numeric field). This should happen at the moment the key is depressed, so the error can be corrected while the eye is on the data and the finger is on the key.

Second, as each field is completed, date validations, range checks, table look-ups and calculations should be performed.

Third, when the record is complete, balancing and multi-field validations can take place.
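Putting the three stages together, the core entry loop of a data entry system can be outlined as follows. This is a simplified Python sketch of the structure, not any particular product's API:

    # Simplified sketch of the three validation stages.
    class Field:
        def __init__(self, name, char_sieve, field_edit):
            self.name = name
            self.char_sieve = char_sieve  # stage 1: per keystroke
            self.field_edit = field_edit  # stage 2: on field completion

    def enter_record(fields, record_edit, read_key):
        """record_edit is stage 3: it runs once the record is complete."""
        record = {}
        for field in fields:
            value = ''
            while True:
                key = read_key()
                if key == 'ENTER':
                    error = field.field_edit(value)       # stage 2
                    if error:
                        # In a real system the keyboard locks until the
                        # operator acknowledges and corrects the error.
                        print(error)
                        continue
                    break
                if field.char_sieve(key):                 # stage 1
                    value += key  # invalid keystrokes are simply rejected
            record[field.name] = value
        return record_edit(record)                        # stage 3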

Here is an overview of the most common data validation techniques:

Character Sieves. The first line of defense against keying errors is to eliminate invalid keystrokes as they occur. For example, the system should not allow alphabetic characters in numeric fields. Depending on the data, it may be possible to perform more sophisticated single character validations. For example, if there is a field for gender, it may be desirable to allow the operator to enter only "M" or "F."
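In the skeleton sketched above, a sieve is simply a per-keystroke predicate. For example:

    # Example sieves for the hypothetical structure sketched earlier.
    def numeric_sieve(key):
        return key.isdigit()

    def gender_sieve(key):
        return key in ('M', 'F')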

Field Edits. As soon as a field is entered, it should be edited. A wide variety of edits is needed, ranging from simple numeric range checks to database look-ups and computed values. Field edits should have multiple levels of capability. For example, the first level of date validation ensures the month is in the range one to twelve; a deeper level detects future dates and unreasonably old dates. The common functions should be provided by the system, and there should also be a mechanism for easily adding custom edits that are unique to the application.
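A layered date edit might look like the following sketch, in which the expected format and the hundred-year cutoff are arbitrary assumptions:

    import datetime

    def date_edit(value):
        """Layered MM/DD/YYYY check; returns an error message or None."""
        try:
            month, day, year = (int(p) for p in value.split('/'))
            date = datetime.date(year, month, day)
        except ValueError:
            return 'Invalid date'               # level 1: basic validity
        today = datetime.date.today()
        if date > today:
            return 'Date is in the future'      # level 2: reasonableness
        if date.year < today.year - 100:
            return 'Date is unreasonably old'   # arbitrary cutoff
        return None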

Table Look-Ups. Comparing field values with database tables of acceptable values and other information is one of the most important types of field edits. Table look-up with substitution is a valuable "keystroke saver" that improves productivity. For example, the operator could key in AL and have the word Alabama appear in the field. However, this process must be extremely rapid and efficient and never delay the keyer. This requires very good programming techniques and well-designed systems to be effective.
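A look-up with substitution can be sketched as an in-memory dictionary; the essential requirement is that the answer come back before the operator's next keystroke:

    # Illustrative table; production systems index large tables so the
    # look-up never delays the keyer.
    STATES = {'AL': 'Alabama', 'AK': 'Alaska', 'TX': 'Texas'}

    def state_edit(value):
        """Return (substituted value, error message)."""
        full_name = STATES.get(value.upper())
        if full_name is None:
            return value, 'Unknown state code'
        return full_name, None  # the full name replaces the keyed code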

Computed Values. To achieve maximum productivity and accuracy the system must provide for flexible and fast field edits to compute values based on one or more data fields and database values. For example, if a series of items is entered for an invoice, the system might calculate the sales tax, shipping charge and total.
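As a sketch, the invoice computation might look like this, with hypothetical tax and shipping figures:

    # Hypothetical rates, for illustration only.
    TAX_RATE = 0.07
    FLAT_SHIPPING = 5.00

    def compute_invoice(line_item_amounts):
        """Derive tax, shipping and total from the keyed line items."""
        subtotal = sum(line_item_amounts)
        tax = round(subtotal * TAX_RATE, 2)
        return subtotal, tax, FLAT_SHIPPING, subtotal + tax + FLAT_SHIPPING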

Check Digits. Many fields, such as account numbers, have a self-checking digit in the number that can be used to detect errors in the value. Dozens of check digit algorithms are in use. A good data entry module can handle them all.
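One widely used scheme is the Luhn algorithm, which backs the check digit on credit card and many other account numbers. A compact implementation:

    def luhn_valid(number):
        """True if the number's trailing check digit is consistent."""
        digits = [int(d) for d in str(number)]
        total = 0
        for i, d in enumerate(reversed(digits)):
            if i % 2 == 1:        # double every second digit from the right
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    print(luhn_valid('79927398713'))  # True -- a standard test number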

Field Duplication. A common method of keystroke reduction is the ability to duplicate values from previous fields and from previous documents. Often the documents are grouped by location, date, account number and other fields that can be duplicated. Default values are a special case of duplication. Both the keyer and the system should be able to duplicate information from other fields.

Optional And Required Fields. Some fields must always be entered and the system must enforce this requirement. Other fields are optional. Those that are seldom entered should have a default set up to automatically skip them. But there must be an efficient way for the keyer to enter the fields when necessary.

Intelligent Field Skipping. The best systems have the ability to dynamically skip fields based on the data entered in previous fields. For example, if a person is single, the fields relating to a spouse might be skipped. This feature speeds key entry and reduces the opportunity to make errors.
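A sketch of data-dependent skipping, using the spouse example and hypothetical field names:

    # Skip rules keyed by a prior field's value (hypothetical names).
    SKIP_RULES = {
        ('marital_status', 'S'): ['spouse_name', 'spouse_birth_date'],
    }

    def fields_to_skip(record):
        """Return the fields made irrelevant by data already keyed."""
        skipped = []
        for (field, value), targets in SKIP_RULES.items():
            if record.get(field) == value:
                skipped.extend(targets)
        return skipped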

Balancing And Multifield Calculations. Some errors cannot be detected until all of the data for a transaction has been entered. Financial data in particular needs "balancing" validations. These validations must be essentially instantaneous or else the user cannot reach his or her potential productive speed.
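For example, a balancing edit might verify that the keyed line items sum to a separately keyed batch total. A sketch, with hypothetical field names:

    def balance_edit(record):
        """Record-level check: line items must sum to the keyed total."""
        if sum(record['line_items']) != record['batch_total']:
            return 'Out of balance'
        return None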

Double-Key Verify. The most time-proven method to ensure accuracy of key-entered data is to key it twice, preferably by different people, and compare the results. This is called double-key verify or rekey verify. Historically, double-key verify has been used to achieve data accuracy exceeding 99.99%. Usually, one person keys the data and another person then keys the same data without seeing what the first operator keyed.

As the verify operator keys, the software compares what is entered, character by character, with the original data and stops instantly when there is a difference. The verify operator then checks to see whether her keying is correct or she made an error. When the verify operator determines that the original data is in error, the erroneous characters are replaced. The best keyers should be the verify operators, because they are responsible for quality control. There is no need to double-key verify fields that can be validated by other methods, so this feature is not necessary for every field; only fields that cannot be validated programmatically should be double-key verified.
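The comparison loop itself is straightforward. In the sketch below, read_key and resolve are stand-ins for the real keystroke source and the operator's correction decision:

    def verify_field(original, read_key, resolve):
        """Key a field a second time, stopping at the first character
        that differs from the original keying."""
        keyed = ''
        while len(keyed) < len(original):
            key = read_key()
            expected = original[len(keyed)]
            if key != expected:
                # resolve() stands in for the verify operator deciding
                # which version is correct; it returns that character.
                key = resolve(expected, key)
            keyed += key
        return keyed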

John R. Haley, President, Viking Software Services, Inc.