Friday, October 28, 2005

The open eleven steps to telecommuting

I have set up and supported remote sites and home-based telecommuting. Listen to my advice, listen very carefully, and save your sanity.

If your organization is large enough, it is likely that you will have a few older desktop PCs that have been, or are due to be, replaced during an upgrade cycle. PCs that are inadequate for Microsoft Windows XP and Office 2003 are more than powerful enough for many current versions of Linux, especially in the role of server. Second-hand PCs with the required specifications can also be acquired very cheaply.

1) Find an older PC, at least a PII 300 with 256 MB of memory, to set up as a headless (no display or keyboard) server and firewall. A simple web-based interface (or even an external hardware push button) can be used by the local users to start/stop the server and internet connection. All other maintenance should be handled remotely via ssh, Webmin and VNC.
2) Install a second NIC or connect the modem directly to the server. Connection to the Internet should be through the server, and connection to the Office should be through a VPN on the server. Use a dynamic DNS service for each site so you can remotely log on to the local server via ssh.
3) Install a new IDE hard drive in a 3.5" removable rack and tray. The drive should be more than big enough for the operating system (Linux of course) and copies of some of the local desktop partitions. A telecommuter can shut down the server and bring the drive in during the day to resync and repair.
4) Install a DHCP daemon on the local server to allocate local IP addresses, DNS and gateway settings. If the desktops are network-boot capable, then install TFTP so Knoppix can be booted remotely via PXE over the network (see the dhcpd.conf sketch after this list). If the desktop OS is constantly crashing, or is infected by malware, the user can select PXE/network boot via the BIOS and boot into Knoppix. The user can then be instructed over the phone to enable the ssh server to allow remote scanning, repair and reimaging of the desktop partitions. The user can use the Knoppix desktop to continue working with full access to files while the remote administrator fixes/reimages the drive in the background. (Consider hiring someone who knows how to customise Knoppix or another live Linux system for your setup.)
5) Partition the desktops with a C: partition (or, in the case of Linux, the root partition) no larger than the installed software requires. When the software is installed, use dd and netcat from a live Knoppix boot to copy/clone a snapshot of the partition to the server (see the sketch after this list). You can allocate the remaining free space as a persistent partition where documents are stored.
6) Install and enable a remote VNC service on all the platforms, but only allow incoming connections from the local server, which are redirected over an SSH tunnel (see the tunnelling example after this list).
7) For local backup, create shared directories on the desktops accessible by the server. On the local server, create loopback-encrypted file systems, then unmount and copy the images to the desktop shares in chunks, using redundancy if enough space is available on the desktops. Checksum (MD5 is enough) each piece. A sketch follows the list.
8) If the network load to the Office is taking up all the available internet bandwidth, or the connection is just too slow, then install a proxy server on the local server (a minimal Squid example follows the list). You can also consider using a distributed filesystem (OpenAFS is still the best) with access for the local users via a Samba share.
9) If phone charges are eating into the budget, and the internet connection is good enough, then install Asterisk on the local server (upgrade the server to a Celeron 800 MHz or better) and PCI cards with enough FXS ports for each local user. Don't bother with software-based phones/headsets. The phone will work when the desktop does not.
10) Set up a Linux server at the Office that operates as a thin-client application server. Allow remote access through both FreeNX and VNC. Create accounts that operate as virtual meeting rooms, with multiple users logging in via VNC (see the shared-session example after this list). Use vncserver with a screen size of around 1000x600, which will fit in a VNC viewer on any 1024x768 desktop. Use phone-based conference calling for voice -- it's a lot less hassle for the users.
11) Add the usual list of cross-platform applications: Firefox, Thunderbird, Gaim, OpenOffice.org, and so on.
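For step 4, here is a minimal sketch of what the ISC dhcpd configuration on the local server might look like, assuming the server sits at 192.168.1.1 and also runs the TFTP service holding the PXELINUX/Knoppix boot files; the addresses, range and filename are placeholders to adapt, not values from any particular setup:

    # /etc/dhcpd.conf -- hand out local addresses and point PXE clients at TFTP
    ddns-update-style none;                 # dhcpd 3.x wants this stated explicitly
    subnet 192.168.1.0 netmask 255.255.255.0 {
        range 192.168.1.100 192.168.1.150;  # pool for the desktops
        option routers 192.168.1.1;         # the local server is the gateway
        option domain-name-servers 192.168.1.1;
        next-server 192.168.1.1;            # TFTP server with the boot files
        filename "pxelinux.0";              # PXELINUX loader that chains to Knoppix
    }

Recent Knoppix releases also ship a terminal-server mode that can generate much of this setup for you.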
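For step 5, cloning the software partition with dd and netcat is a one-liner on each end. In this sketch the desktop's partition is assumed to be /dev/hda1, the server 192.168.1.1, and the image path on the server is just an example; gzip keeps the transfer size down:

    # On the server: listen on port 9000 and store the incoming image
    nc -l -p 9000 | gzip -d > /srv/images/desktop1-hda1.img

    # On the desktop, booted from Knoppix: read the partition and stream it over
    dd if=/dev/hda1 bs=1M | gzip -c | nc -q 5 192.168.1.1 9000

Restoring is the same pipeline run in reverse, with dd writing to the partition instead of reading from it.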
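For step 6, the VNC traffic is carried inside an SSH connection to the site's server, so nothing crosses the internet in the clear. The dynamic DNS name and the desktop address below are placeholders:

    # From the office: forward local port 5901, via the site's server,
    # to the VNC port (5900) on the desktop at 192.168.1.50
    ssh -L 5901:192.168.1.50:5900 admin@home-site.dyndns.example

    # In another terminal, point the viewer at the local end of the tunnel
    vncviewer localhost:1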
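For step 7, one way to build the encrypted backup images is with an encrypted loopback device, assuming a kernel with cryptoloop (or loop-AES) support; the sizes, paths and mount points below are only examples:

    # Create a 512 MB container file and attach it as an encrypted loop device
    dd if=/dev/urandom of=/backup/store.img bs=1M count=512
    losetup -e aes /dev/loop0 /backup/store.img    # prompts for a passphrase
    mkfs.ext3 /dev/loop0
    mount /dev/loop0 /mnt/backup
    # ... copy the files to be backed up into /mnt/backup ...
    umount /mnt/backup
    losetup -d /dev/loop0

    # Split the image into chunks for the desktop shares and checksum each piece
    split -b 100m /backup/store.img /backup/store.img.part-
    md5sum /backup/store.img.part-* > /backup/store.img.md5

Verifying the pieces later is just a matter of running md5sum -c against the checksum file before reassembling them with cat.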
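For step 8, a Squid proxy on the local server needs only a handful of lines beyond the distribution defaults. The network range here is a placeholder and should match the DHCP range handed out in step 4:

    # Additions to /etc/squid/squid.conf -- cache web traffic for the local LAN only
    http_port 3128
    cache_dir ufs /var/spool/squid 1024 16 256
    acl localnet src 192.168.1.0/255.255.255.0
    http_access allow localnet
    http_access deny all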
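For step 10, a meeting room is just a shared VNC display on the office server, sized so it fits inside a 1024x768 client screen. The display number and hostname are examples, and the exact option names vary slightly between VNC flavours:

    # On the office server: start display :5 at 1000x600 and let viewers share it
    vncserver :5 -geometry 1000x600 -depth 16 -alwaysshared

    # Each participant points a viewer at the same display
    vncviewer -shared office-server.example.org:5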

The return on investment from the reduction in desktop downtime will quickly outweigh the initial outlay for new hard drives and possibly FXS cards.

Sunday, October 09, 2005

Our Data: an appeal - a "Plimsoll line" for apps

From June 14, 2002:

However relatively bad the security of Microsoft's products is in comparison to what the free-licensed and open source communities (as well as practically every other vendor on the planet) provide, Microsoft is not alone in having vulnerabilities; this is a major issue for Linux, BSD and Unix, as well as every other OS and vendor.

From the Plimsoll Club history

Samuel Plimsoll brought about one of the greatest shipping revolutions ever known by shocking the British nation into making reforms which have saved the lives of countless seamen. By the mid-1800's, the overloading of English ships had become a national problem. Plimsoll took up as a crusade the plan of James Hall to require that vessels bear a load line marking indicating when they were overloaded, hence ensuring the safety of crew and cargo. His violent speeches aroused the House of Commons; his book, Our Seamen, shocked the people at large into clamorous indignation. His book also earned him the hatred of many ship owners who set in train a series of legal battles against Plimsoll. Through this adversity and personal loss, Plimsoll clung doggedly to his facts. He fought to the point of utter exhaustion until finally, in 1876, Parliament was forced to pass the Unseaworthy Ships Bill into law, requiring that vessels bear the load line freeboard marking. It was soon known as the "Plimsoll Mark" and was eventually adopted by all maritime nations of the world.

The risks, issues and solutions for providing a more secure operating and application environment have been known for decades.

Those who do not already comprehend the issues and are willing to learn should take some time out to listen to some of the speeches in Dr. Dobb's Journal's Technetcast security archives, starting with "Meeting Future Security Challenges" by Dr. Blaine Burnham, Director of the Georgia Tech Information Security Center (GTISC) and previously with the National Security Agency (NSA).

The design and implementation of some applications and servers are just too unsafe to use in the "open ocean" of the internet.

Numerous security experts have railed against Microsoft's lack of security, best summed up by Bruce Schneier, founder and CTO of Counterpane Internet Security, Inc., who rightly said:

Honestly, security experts don't pick on Microsoft because we have some fundamental dislike for the company. Indeed, Microsoft's poor products are one of the reasons we're in business. We pick on them because they've done more to harm Internet security than anyone else, because they repeatedly lie to the public about their products' security, and because they do everything they can to convince people that the problems lie anywhere but inside Microsoft. Microsoft treats security vulnerabilities as public relations problems. Until that changes, expect more of this kind of nonsense from Microsoft and its products. (Note to Gartner: The vulnerabilities will come, a couple of them a week, for years and years...until people stop looking for them. Waiting six months isn't going to make this OS safer.)

However, Microsoft's products are not alone in the presence of vulnerabilities; this is a major issue for Linux/BSD and Unix, as well as any other OS and vendor.

In a recent speech, "Fixing Network Security by Hacking the Business Climate", also now on Technetcast, Bruce Schneier argued that for change to occur the software industry must become liable for damages from "unsecure" software. Historically, however, liability alone has not always driven change, since most businesses can insure against damages and pass the cost along to the consumer.

The Ford Pinto and, more recently, the Ford Explorer's tires are two examples of public and media pressure being more effective than the threat of lawsuits alone. Even so, just as with the automotive industry, eventually, through public pressure, governments around the world have had to step in and pass regulations that set a minimum set of requirements an automobile has to meet to be deemed "road worthy". This includes crash testing as well as the inclusion of safety equipment on all models. The requirements are not constant and change to meet the expectations and demands of the public and lawmakers.

The onus is not only on the automotive industry itself but also on the users: most countries require that all automobiles undergo regular inspection and maintain an up-to-date "Warrant of Fitness".

In the same way, if you want a secure IT infrastructure, eventually the software design, implementation and each deployment will have to undergo the same type of regulation and scrutiny.

Unix, Linux, BSD and especially OpenBSD are currently far superior in terms of security, both in closing vulnerabilities in applications before they have a chance to be widely exploited and in implementing more secure access subsystems (SELinux/LSM, etc.).

However, should the Unix, open source and free-licensed communities and vendors be taking a more active approach, including lobbying government, to
1) set up a minimum set of expectations in the design and implementation of internet "accessing" software; and
2) ensure that all deployments are more securely implemented; and/or
3) remove inherently insecure products from the marketplace?

IMO, all three of the above would be preferable, for every software vendor including Microsoft, to attempts to allow liability lawsuits against vendors over deployments the vendors do not necessarily have any control over.