This article is from Sage's whitepaper (PDF) of the same name, authored by Grant Howe. Sage offers donated Sage Business Solutions products and discounted Sage Payment Solutions services to eligible nonprofits and public libraries through TechSoup.
Making mission-critical business applications (software solutions) available remotely is a hot topic these days. Remote access allows staff to work without being tied to a specific physical location. With the amount of consolidation, decentralization of offices, and home- or travel-based positions, being able to offer solid remote access technology to your staff, board, and volunteers is critical to your success.
Figuring out what to do, or even where to start, can be a significant quandary. Not only must you take into account the architecture of the application you want to access remotely, but you must also consider bandwidth limitations (on both sides of the solution), your data security and compliance needs, and your IT management and administration strategy.
In part one of this article we will introduce you to the terminology and concepts of remote access and provide a platform for you to begin your evaluation. First, we will look at different remote technology options and the various software architectures for which they are appropriate. In part two, we'll give you tips on how to best evaluate your current internal resources and potential vendor partners to create the best solution for your needs.
Types of Applications
In order to create the appropriate remote access scenario, it's important to understand the architecture of the application that you want to access remotely. This is the first step, as some technology choices may not be feasible based on the application architecture. If you don't know which of these architectures you are using, simply contact your IT staff, business partner, or vendor for guidance.
The standalone or client-database architecture is when a larger, dedicated application is installed on each user's desktop. This is sometimes called a "fat client." It connects directly to a database, which may be either on the same machine as the client or over the network. This solution is usually used when only a small number of users need access to the application.
The client-server architecture is similar to the client-database model. It also consists of a larger footprint application installed on each user's desktop, but this "fat client" connects to a version of the software installed on the server. The server application then applies business logic before interacting directly with the database. An example of an application set up in this fashion is an email client, such as Microsoft Outlook. The client software must be installed on the workstation and must be configured to connect to a server to get data.
A web-based architecture uses a web browser as the client and requires minimal software to be installed on the user's computer. This architecture doesn't require a large footprint application to access the database as it uses a standard web browser instead. The web browser works with a web server to deliver a browser-based user interface (UI) to the end user. The web server may interact with other application servers to run business logic and return results to the user by way of the browser user interface. The database is usually installed on a different server than the web server. Web-based email is perhaps the most ubiquitous example of web-based computing today.
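To make the web-based model concrete, here is a minimal sketch using Python's standard library (the "Donor report" page and the loopback address are illustrative, not from the original article): a tiny web server stands in for the application tier, and a plain HTTP request plays the role the browser plays.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Minimal "web server" standing in for the application tier. In a real
# deployment it would run business logic and query a database that is
# usually installed on a separate server.
class AppHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Donor report</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to an ephemeral port on the loopback interface for the demo.
server = ThreadingHTTPServer(("127.0.0.1", 0), AppHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" is just an HTTP request -- the role a web browser plays.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
    page = resp.read().decode()

print(page)  # the browser would render this HTML as the UI
server.shutdown()
```

Note that nothing application-specific is installed on the client side: any machine with a browser (here, any machine that can issue an HTTP request) can use the application.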
Remote Technology Options
There are a number of technology options for gaining remote access to applications, each with various bandwidth requirements and security considerations. We will explore some of the more popular ones and discuss the pros and cons of each.
Tunnel In: Virtual Private Network
A virtual private network (VPN) is a secure tunnel between a remote user and your internal network. The user creates a session with your VPN server or firewall appliance and is then allowed to pass data directly into your network. It is as if the user were plugged into a wall jack at the office, except for one very important difference: the user's bandwidth is limited by the lesser of their available Internet bandwidth and yours. In other words, the maximum size of the "pipe" is determined by whichever end passes the smallest amount of data.
Most offices have 100 megabit per second (Mbps) connectivity internally and significantly less out to the Internet (1.5 Mbps, perhaps). Most homes have even less, even with a broadband connection. Your users may find the application sluggish, or the connection may be too unstable for the application to stay connected to the server. This makes VPN a challenging option for solutions that use a large-footprint application installed on the local machine. However, VPN does work well for web-based applications if additional security is desired, since having users log on through the firewall provides another layer of protection for your web server.
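The "smallest pipe wins" rule is simple arithmetic. This sketch uses the illustrative speeds above (the 3 Mbps home figure and the 50 MB file size are assumptions for the example, not measurements):

```python
# Effective VPN throughput is capped by the slowest link in the path:
# typically the office's Internet uplink or the home user's connection.
def effective_mbps(office_internet_mbps: float, home_mbps: float) -> float:
    return min(office_internet_mbps, home_mbps)

office_lan = 100.0       # Mbps inside the office -- never the bottleneck here
office_internet = 1.5    # Mbps from the office out to the Internet
home_broadband = 3.0     # Mbps assumed for the remote user's connection

pipe = effective_mbps(office_internet, home_broadband)
print(pipe)  # 1.5 -- the office's Internet link is the bottleneck

# Moving a 50 MB data file through that pipe (1.5 Mbps = 0.1875 MB/s):
seconds = 50 / (pipe / 8)
print(round(seconds))  # ~267 seconds, which is why fat clients feel sluggish
```

This is why a fat-client application that chats constantly with its database performs so differently over VPN than over the office LAN.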
A Virtual Window: Remote Desktop Connection
Remote desktop services allow you to host an application on a remote server and transfer what amounts to screen shots back to the client. Keyboard and mouse inputs are forwarded to the server and the results are shown on the subsequent screen shots that come back. Think of it like using your computer as a virtual window into the server where the application is installed.
This technology allows you to offer traditionally locally installed software to remote users without needing to boost their bandwidth for the application to communicate with the server effectively. The "screen shots" are compressed, so a remote desktop connection (RDC) uses a constant but small amount of bandwidth.
Older versions of this technology presented an entire desktop for the user to use as essentially a remote workstation. Starting with the 2008 server version, Microsoft's Terminal Services allows for the publishing of individual applications only, if you so choose, and later versions of Windows Server offer even more optimized configuration options for remote access. The end result is that the user can click an icon in the Start menu, launch the application, and use it as if it were installed on the local machine, even though it is actually running on a remote server.
Citrix XenApp is Citrix's version of Terminal Services and allows publishing of applications in the same way. XenApp may allow for easier administration of the applications and a better user experience; depending on your configuration, it may also let you work with mobile devices. However, licensing and implementation costs are typically higher than with Terminal Services. Either solution (and RDC in general) is a good choice for remote access to applications that use a large-footprint user interface.
RDC can be managed internally by your own IT staff, but many small- to mid-sized organizations choose to partner with specialized technology and hosting providers. While the level of service varies with cost, hosting relieves considerable, and possibly all, IT burden from your staff. For example, an Application Service Provider (ASP) takes an application, puts it into a hosting infrastructure, and sells the use of the software directly to customers. The application is typically one built to be installed directly on client machines, but the ASP uses Terminal Services, Citrix XenApp, or another technology to take the administrative burden off of the end consumer. Later, we'll discuss how to evaluate the right partner for your needs.
A web-based application does not need RDC to be set up on a client machine; data passes over the Internet as encrypted web traffic. Often, such an application is built specifically as a software-as-a-service (SaaS, also called "cloud computing") offering and requires no IT department involvement: you can sign up and begin using it right away. Normally only the software vendor can offer SaaS to customers.
Security Considerations and Compliance
No evaluation like this would be complete without a discussion of security and compliance. There would be no quicker way to halt your campaign, traumatize your constituents, and give your organization a black eye than a security breach resulting in a loss of personal information. If your organization has neither the knowledge nor the skill set to build and execute a solid security plan, seek outside help from a professional.
General Security Best Practices
Security best practices involve the use of a properly configured firewall, antivirus protection, automated patching of operating systems, and security policies and procedures. Other areas to consider are intrusion detection and prevention measures, vulnerability assessment, and employee security training. The scope of these methods is too large to be included in this brief, but there is ample information about these practices on the web. Look for the term "defense in depth" in your research.
Another best practice is to set up servers that perform only one service and lock down or "harden" them against breaches. For example, a web server can be hardened and allowed only to serve web pages, and a database can be hardened to only perform database functions. When you mix a web server and a database server together on one box, a hacker has the opportunity to breach your database server by hacking through the web server's vulnerabilities.
Sarbanes-Oxley (SOX)
Few compliance regulations have gotten the attention that the Sarbanes-Oxley Act (SOX, or SarBox) has received in the industry. A federal law passed in 2002 in the wake of the massive collapse of Enron, it requires that covered companies place auditable controls on key points in any process that might affect the accuracy of their financial reporting. It also provides a key deterrent for executives who knowingly certify false or inaccurate financial data: jail time.
Currently SOX is required only for public companies. Even though it does not apply to nonprofit organizations at the moment, it is important to note, since many nonprofit organizations are preemptively adopting some of the practices and new regulations may be passed in future years for the sector. Read the Sarbanes-Oxley Act.
PCI
PCI compliance refers to the Payment Card Industry Data Security Standard (PCI DSS), published in 2004. Organizations that accept or store credit card data are required by their processors to be PCI compliant. To be compliant you must patch your systems regularly, conduct vulnerability scans, and undergo an official audit at least annually. Read more information on PCI.
HIPAA
The Health Insurance Portability and Accountability Act (HIPAA) was enacted in 1996. There are many sections in the act, but the two most important to this discussion are the "Privacy Rule" and the "Security Rule," added in 2003. The Privacy Rule states that individuals have a right to access their "personal health information," a right to have that information kept confidential, and a right to request an audit of its use. The Security Rule complements the Privacy Rule by laying out standards for the control and administration of individuals' personal health information, identifying administrative, physical, and technical safeguards. You must put policies and procedures in place and audit their use to be in compliance. Read more information on HIPAA.
SAS 70 (Type I and II)
Statement on Auditing Standards 70 is an auditing standard issued by the American Institute of Certified Public Accountants. SAS 70 standards are usually applied to external service providers to ensure that their operational policies, procedures, and controls protect the security and integrity of the services they host.
Type I SAS 70 compliance is an auditor's assessment of the effectiveness of the design of the operational policies, procedures, and controls that the entity has put in place. Type II includes the same evaluation as Type I and adds an audit of whether the policies, procedures, and controls were in use at the time of the audit and whether they were effective. Read more information on SAS 70.
In part one we discussed the technical components of remote access. Now read part two for tips on evaluating your internal resources and potential vendor partners.