Electronic Health Records and the Cloud

Last year I was recruited to find an Electronic Health Records (EHR) system for a doctor who had just gone through a failed implementation. I am always intrigued by exposure to new sectors of technology and by learning systems inside out.

The existing EHR system had suffered a hardware failure, and the vendor was asking for over $10,000 to recover the patient data. Combined with high maintenance and licensing fees, this proved to be too much for the doctor.

A consultant came in and sold the doctor on a hosted EHR system he had developed. Unfortunately, expectations were never set, and the doctor assumed his patient data would be available on the new system. Once it became apparent that recovering and importing that data would cost thousands more, the relationship went south.

This particular project was not only a technical challenge but also a customer service one. Right from the start I made sure that expectations were set, and then began looking at possible solutions.

The many options available included traditional vendors, open source packages, and home-grown systems: Tolven Healthcare, PatientOS, OpenEMR, Clearhealth, Abraxas, Medworks, and Pulse, among others.

I was looking to implement something that not only met the client's requirements (demographics, medical history, medications and allergies, immunization status, laboratory test results, radiology images, and billing) but was also scalable as a potential business. I ruled out the traditional EHR systems because of their high capital expenditure, ongoing costs, and approved-VAR requirements. The open source solutions seemed very attractive, but I wanted something that did not require an on-site server; it had to be hosted, and using the cloud made it scalable.

So it came down to hosting an open-source package myself or using someone who had already done the legwork. Since I didn't want to support this long term, the search turned to two or three new cloud service providers, of which only one I found mature enough to recommend: Practice Fusion.

Practice Fusion provides a free, web-based Electronic Medical Record (EMR) system to physicians. With charting, scheduling, e-prescribing, billing, lab integrations, referral letters, unlimited support and a Personal Health Record for patients, Practice Fusion’s EMR addresses the complex needs of today’s healthcare providers and disrupts the health IT status quo.

Although this did not turn out to be the passive income generator I always have as a goal, it was a very educational project and the platform for other ideas and projects.


PBX in a Flash with CBeyond

Last week I deployed a PBX in a Flash system using SIPConnect from CBeyond. It was so successful that I will use PIAF in lieu of Trixbox for all future deployments of this type, and I will replace my home PBX with it to take advantage of Skype and Google Voice integration.

In this case I used Aastra 53i (English edition) VoIP phones. When connected to the network, each phone retrieved an IP address from the DHCP server, discovered the PBX via mDNS, checked for and downloaded the most recent firmware available on the PBX, and pulled down the default configuration, which prompts the user to log in. After logging in, the phone created a config file on the PBX for future restarts.

These Aastra phones come in two editions: the English/American edition and the European edition. The power supply for the European edition has different connectors, and the display shows symbols instead of words. Apart from that they appeared identical, but getting the European edition to automatically connect to the PBX and configure itself was very painful: I had to reset the phone to factory defaults and erase the local configuration multiple times, and finally had to define the TFTP server (PBX) IP address on the phone for it to download its configuration.

Two thumbs up for the PBX in a Flash (PIAF) developers who have done a superb job with this distribution holding up the ideals of the original Asterisk@home open source project.


Their documentation was almost flawless, although it was difficult to find the most recent version of the instructions, as they are laid out in bits and pieces across a blog. In pursuit of a perfect install I narrowed the process down to running the ISO install, going through the online download and compilation of Asterisk, and running the update/fix scripts. Before upgrading or installing any module or OS updates, I downloaded and installed the files necessary to deploy the Aastra phones (which is also done by a script), then installed/updated the software via the FreePBX Module Admin, and finally applied the OS updates.

Below is the trunk configuration for connecting via SIPConnect to CBeyond from PBX in a Flash:

Outbound caller ID: 5551231234
Never override caller ID: checked
Maximum Channels: 6

Outbound Settings

trunk name=cbeyond

allow=ulaw&alaw&gsm&ilbc&g726&adpcm
context=from-trunk
disallow=all
dtmfmode=auto
fromdomain=sipconnect.dal0.cbeyond.net
host=sipconnect.dal0.cbeyond.net
insecure=very
outboundproxy=sip-proxy.dal0.cbeyond.net
qualify=250
secret=[secret-password]
type=peer
username=5551231234

Registration String: 5551231234:secret-password@cbeyond/5551231234

Note: there are no inbound settings required. The incoming DID configuration determines where each channel from the trunk will ring.



Back to Blogging

It's been a while since I blogged, as I have been spending a lot of time looking for an angle to take advantage of the current economic crisis. There is little doubt in my mind that this is a prime time to do something, so I have been working on generating passive income by helping small businesses reduce their operating costs, and on product development; I hope to have something solid within the next four weeks.

I really shouldn't feed my ego this way, but I can't avoid mentioning that a specific post on the Conficker virus has consistently brought over 100 visitors a day to my blog.

With this in mind I intend to continue blogging about security, as well as posting demos/reviews over the next few weeks of several products that I believe are industry leaders. Among these are the SSL VPN appliance from Juniper and its open source counterpart, the TippingPoint Intrusion Prevention System (IPS) and its open source counterpart, F5 Networks' Link Controller and Local/Global Traffic Manager, and Riverbed's Steelhead appliance for application acceleration and WAN optimization.


Web Conferencing With Dimdim

For a while I've been wanting to write several articles on the power of open source and its potential, covering multiple software applications I have run into, and this is definitely one of those cases.

In this economic downturn, the use of open source will be more attractive than ever as a strategy to keep costs under control when being asked to do more with less.

This industry was defined and dominated by a company called WebEx in the mid-nineties, which was later acquired by Cisco Systems. Although a very powerful application, it remained accessible only to those who could afford its high price tag.

Over the years several companies tried unsuccessfully to dethrone WebEx, which held its position most probably due to its reliability and stability.

In 2004, Citrix Systems brought web conferencing to the desktop, cornering an untapped consumer/SMB market and reigning as king.

At the time GoToMeeting emerged, WebEx, LiveNote, and others catered mostly to large corporations and sales divisions, entering into six-figure contracts. Citrix Online released GoToMeeting on an "all you can meet" basis, with one monthly (or annual) charge based on the number of authorized hosts. This pricing model was unique at the time but has since been copied by competitors.

In late 2006 I started looking at open source alternatives to the WebExes of the world and stumbled upon Dimdim while browsing through the goldmines of Freshmeat and SourceForge.

The software at that point was still in alpha, at version 1.6. Installation was pretty straightforward once Tomcat was installed, and a plus was the possibility of integration with Moodle, an open source Course Management System (CMS).

Unfortunately, the package's stability was not there. Another offering I looked at was Yugma, a web-based conferencing service. Again, it just wasn't there.

Two years later, Dimdim has gone from alpha to beta and has now exited beta with version 4.5.

Dimdim's installation is far more complicated than in earlier versions, requiring several Python packages and the building and compiling of other applications that support Dimdim. My first attempt at the installation was unsuccessful, but a VM appliance, also provided under the GPLv3 license, came up without a hitch.

The Dimdim web service works right out of the box and appears to be reliable and stable. Scalability will be my next test on this VMware appliance with 1 GB of RAM, to determine if it can handle 2-3 conferences and upward of 50 users.

Promising features include integration with other open source industry leaders.

Dimdim’s commitment to open source software development is supported by integrations with industry-leaders:

  • Zimbra: Dimdim now offers a free zimlet for Zimbra's open source email system.
  • Moodle: Dimdim is integrated with version 1.9 of Moodle's Course Management System.
  • SugarCRM: Dimdim is integrated with the leading open source customer relationship management system.
  • Claroline: Dimdim is embedded within the collaborative learning environment.


Trixbox 2.6 and Sangoma Hardware

Trixbox (formerly Asterisk@Home, A@H) has definitely come a long way since its beginnings in November 2004, and since I started playing around with Asterisk two months earlier. The convenience of being able to download an ISO and have a functional PBX in less than an hour was, and still is, amazing.

An excellent resource is Ward Mundy's blog Nerd Vittles, which I have followed since early 2005 and which has worked on some very cool and interesting projects augmenting Asterisk's functionality. Most recently, in November 2007, they released PBX in a Flash (PIAF) and have also announced an under-$500 appliance running PIAF.

What is Asterisk?

Asterisk is a software implementation of a telephone private branch exchange (PBX) originally created in 1999 by Mark Spencer of Digium. Like any PBX, it allows attached telephones to make calls to one another, and to connect to other telephone services including the public switched telephone network (PSTN) and Voice over Internet Protocol (VoIP) services. Its name comes from the asterisk symbol, “*”.

What is Trixbox?

Trixbox is a turnkey, business-class PBX voice communication system based on the open source Asterisk project. It's no longer necessary to pay thousands and thousands of dollars for a proprietary phone system: by simply downloading the software and installing it on a low-end machine you can have a powerful, open, and robust PBX system. From small systems with only a couple of analog phone lines and extensions to large installs with multiple T1/E1 connections and hundreds of extensions, you can easily use Trixbox to meet your telephony needs.

I believe Trixbox to be the most complete distribution of Asterisk out there, although many of its features may go unused in many cases. On the other hand, I have heard complaints about the lack of collaboration in adding new features and fixing bugs by the folks at Fonality, which makes it less open than it once was.

Parts List:

  • Dell GX-150 with 512 MB RAM and an 80 GB hard drive
  • Sangoma A200 card with 4 FXO ports

Todo List:

  • Upgrade the RAM to 512 MB and the hard drive to 80 GB
  • Install the Sangoma PCI A200 card
  • Insert CD into CD drive and boot from disk
  • Go through wizard and install Trixbox
  • Log in to the computer, update CentOS, and download and install the drivers
    • yum update
    • yum upgrade
    • cd /opt
    • wget ftp://ftp.sangoma.com/linux/RPMS/2.6.1.13/wanpipe-util-3.2.7.1-0.i686.rpm
    • wget ftp://ftp.sangoma.com/linux/RPMS/2.6.1.13/wanpipe-modules-2.6.18-53.1.4.el5-3.2.7.1-0.i686.rpm
    • wanrouter hwprobe
    • wanrouter hwprobe verbose
    • setup-sangoma
      • When asked which codec will be used, select MULAW – North America
      • When configuration of the analog card completes, select 1 to continue
      • When configuration of Zaptel and Wanpipe completes, select 1 to save and restart daemons
      • When asked to start wanrouter at boot time, select 1 for yes
    • ztcfg -vv (to display the analog card installed and its modules.)
  • Install DynDNS client:
  • Create DynDNS account
  • Configure ddclient (add the following to the end of the /etc/ddclient/ddclient.conf file):
    • use=web, web=checkip.dyndns.com/, web-skip='IP Address'
    • server=members.dyndns.org,
    • protocol=dyndns2,
    • login=your-login,
    • password=your-password
    • pbx.dnsalias.com

Trixbox links to several good quick install guides here and a comprehensive list of documentation here.


Educause 2008

This year's Educause conference took place in Orlando, Florida.

Educause is a nonprofit association whose mission is to advance higher education by promoting the intelligent use of information technology. Membership is open to institutions of higher education, corporations serving the higher education information technology market, and other related associations and organizations.

The association provides a social networking Connect site that supports blogs, wikis, podcasts and other platforms for IT professionals to generate and find content and to engage their peers; professional development opportunities; print and electronic publications, including e-books, monographs, and the magazines Educause Quarterly (EQ) and Educause Review[1]; strategic policy advocacy; teaching and learning initiatives; applied research; special interest discussion groups; awards for leadership and transformative uses of information technology; and a Resource Center for IT professionals in higher education.

Major initiatives of Educause include the Core Data Service, the Educause Center for Applied Research (ECAR), the Educause Learning Initiative (ELI), Net@EDU (advanced networking), the Educause Policy Program, and the Educause/Internet2 Computer and Network Security Task Force. In addition, Educause manages the .edu Internet domain under a contract with the U.S. Department of Commerce.[1]

The current membership of Educause comprises more than 2,000 colleges, universities, and educational organizations, including 200 corporations, with 16,500 active members.

Below are pictures from the conference:

[slickr-flickr tag=”educause 2008″ id=”61116089@N00″ group=”n”]

My schedule at the conference:

Tuesday, October 28, 2008

Wednesday, October 29, 2008

Thursday, October 30, 2008

Friday, October 31, 2008

Overall I thought it was an excellent conference, although there weren't as many people this year as in previous ones.

The exhibit hall was fun as always. Some exhibits were great and others fell flat, which brings up another subject: marketing.

There were two exhibits that stood out amongst the crowd: one from Bradford Networks and the other from Trapeze Networks. These companies not only gathered leads but engaged their prospective customers, allowing them to deliver their sales pitch. Two companies I will definitely be following up with.

Other companies that did well on their marketing pitch were Turning Technologies, Novell, CDW, Zimbra, Elluminate, and Microsoft, although the only thing Microsoft had going for it was a great demo of Image Composite Editor on a smart board.

Microsoft Image Composite Editor is an advanced panoramic image stitcher. The application takes a set of overlapping photographs of a scene shot from a single camera location and creates a high-resolution panorama incorporating all the source images at full resolution. The stitched panorama can be saved in a wide variety of formats, from common formats like JPEG and TIFF to multi-resolution tiled formats like HD View and Silverlight Deep Zoom.

The things that characterized the good exhibits can be summarized in a few words: they were accessible, had an inviting environment, gave away free stuff (like an iPod touch and laptops every hour), and had either professionals or very seasoned salespeople giving the presentations.

On the other side of the coin, were the very big and expensive exhibits which just didn’t deliver.

Some that deserve mention: AT&T had a very expensive three-environment exhibit representing campus life, with U-verse all over the place. Alcatel-Lucent had an uninviting exhibit, and their staff sat down most of the time. Citrix was just offering a $5 Starbucks card for filling out a survey. Cognos had a closed exhibit that wasn't inviting to anyone.

It's not that these companies were cheap, which they were; the point is that they are spending a lot of money on lead generation when they could also be qualifying those leads and delivering their product demos to a captive audience.


Cloud Computing – Made Simple and Affordable

Depending on how many people you ask to define "Cloud Computing", you are very likely to get the same number of answers.

Cloud Computing builds on decades of research in a number of computer science fields including grid computing, distributed computing, utility computing and more recently networking, web and application services.

It implies a seamless Service Oriented Architecture (SOA); basically the delivery of an integrated and orchestrated suite of on-demand services to an end-user through the grouping of functionality around business processes, making them accessible over a network and allowing these services to communicate with each other by passing data from one service to another in a loosely coupled manner.

This concept, built upon and evolving from older ideas of distributed computing and modular programming, promises reduced information technology overhead, virtualization of resources, greater flexibility, and lower total cost of ownership (TCO).

A group from North Carolina State University and George Mason University presented a full-day seminar, "Cloud Computing Made Simple and Affordable", this year at Educause 2008 in Orlando, Florida.

Since 2004 they have been hard at work building the Virtual Computing Lab (VCL), a new, scalable, and accessible computing system architecture.

High costs, support and security issues, software licensing, space requirements, and demands for enhanced local and remote 24 x 7 user access constantly challenge computing in education. The Virtual Computing Lab (VCL), a new, adaptable, and open source approach to computing, provides a cloud-like rich services computing environment to serve advanced research and student computing simultaneously and affordably, within a scalable and accessible system architecture. The VCL maintains the diversity and flexibility essential to an academic environment while providing computational resources with an unprecedented lack of restrictions and significant reduction in costs. The VCL is an Internet-based service that allows users to augment their own computers of varying types and capabilities—without their having to acquire new or uniform computers, install and run advanced software, provide their own software support, and so forth.

The speakers at the session included Samuel F. Averitt (NCSU), Aaron Peeler (NCSU), Sharon P. Pitt (GMU), John Savage (GMU), Henry E. Schaffer (NCSU), Sarah R. Stein (NCSU) and Mladen A. Vouk (NCSU).

The open-source project has been submitted to, and recently accepted by, the Apache Foundation as one of its Incubator Projects.

VCL relies on the LAMP stack (Linux, Apache, MySQL, and PHP) and was originally developed in a blade environment using IBM blades and xCAT, a scalable distributed computing management and provisioning tool that provides a unified interface for hardware control, discovery, and diskful/diskless OS deployment.

VCL provides a Web 2.0 reservation system, making a multitude of hardware and virtualized systems running a variety of operating systems and applications accessible to end users via the Remote Desktop Protocol (RDP) for a pre-determined period of time. Images for these systems are kept online or offline depending on a last-used/commonly-used algorithm, so an offline image can take up to 15 minutes to load.

Not only does this approach give users access to applications without a local installation, but by making use of virtualization technologies such as the VMware ESXi hypervisor, it can multiply available computing power by a substantial factor while reducing the total cost of ownership.

Going even further, computers not being used could be aggregated into the cloud, making them all the more valuable.


Poor Man’s Disaster Recovery

Backups are probably the most tedious, time-consuming job for a system admin, often regarded as a low priority until something goes wrong.

All hell breaks loose and you stumble around for tapes, building catalogs, restoring data, finding unusable tapes or corrupt data, and looking for excuses or stories to tell management.

Last month I discussed personal backups and disaster recovery here.

I have added to my arsenal of tools an application called SyncBack, which I run at least every couple of days on all my data, including the "My Documents" and "Documents and Settings" folders, making sure my data and settings are backed up to an external USB drive.

I also use Mozy to keep a historical backup of critical files, which has come in very handy. Mozy provides 2 GB of backup for free and has paid plans for additional storage. A client is installed on the computer and pretty much takes care of everything once it's configured. Other players in this area include Carbonite, HP Upline, IDrive, SOS Online Backup, and Symantec Online Backup.

Disaster recovery is not just about backups and the quickest way to restore files; it is about planning for the worst and deciding how you will continue to operate if the unforeseeable happens.

In a small business, for example, it's rare to have more than one server, which serves as a print server, file server, e-mail server, BlackBerry server, application server, and so on. Even if there are another server or two, they are all running several applications, so redundancy is neither viable nor affordable for a small business.

OK, so backups are getting done, whether online, to tape locally, or to disk. If you want a quick restore, go for disk over tape.

Everything is kosher… not so fast!

What would happen if the server had a major failure? Not something quickly addressed by ordering a replacement part. Could you put your clients on hold for a couple of weeks until a new server arrives?

What if there was a fire? What if someone broke in and stole the server?

That small business would most probably cease to exist if its operations depended heavily on the use of technology.

The same principles used in bigger businesses when it comes to disaster recovery appear to be even more critical for smaller businesses: having a disaster recovery site where the server can be mirrored in the event of a loss.

What better place than the small business owner's home?

So the challenge is to mirror a server located at the office to a server located at home. That sounds out of reach for a small business, since it could involve duplicating licensing costs and paying for mirroring software, and then there's the issue of upload caps from almost any broadband provider, which generally limit upstream bandwidth to 512 kbps or less.

rsync is a software application for Unix systems that synchronizes files and directories from one location to another while minimizing data transfer by using delta encoding when appropriate. This makes it ideal here, since it reduces the data transferred over a limited link to a minimum.

DeltaCopy is an open-source port of rsync to Windows. It has several features that make it ideal for the task at hand, including installation as a service, incremental backups, a task scheduler, and e-mail notification.

DeltaCopy is installed on both the main server and the backup server. The backup server runs DeltaCopy as a service, and if encryption is required, a tunnel over SSH can be set up by installing an SSH server under Cygwin.

The backup server will require DynDNS so that the main server can reach it by name. Two ports, 873 (rsync) and 22 (SSH), will also need to be forwarded on the DSL/cable router on the backup server side.

Then schedule and sleep well knowing you have a “Disaster Recovery” plan.

Resources:

How to install a ssh server
Set up a personal, home SSH server


My Family Tree

Ever since I was a child I have been fascinated by Genealogy (the study of families and the tracing of their lineages and history.)

In the late 80s I made my first attempt to compile my family tree using ClarisWorks for Macintosh. More than ten years later, getting the non-standardized data out of that program and off damaged Zip disks is almost impossible.

I started collecting data again about three years ago, after evaluating several offerings, including desktop applications such as the widely used Family Tree.

I came to the conclusion that I didn’t want to do this all by myself. I needed to enlist my family to gather and enter this information, so a web based application that allowed anybody to participate by entering information was ideal.

PhpGedView is an open source web application that runs on PHP with a MySQL backend, allowing more than one person to contribute to the family tree while the administrator approves content going into it. Features include charts and lists, PDF reports, visitor and user options, and GEDCOM 5.5 support.

With everybody leading busy lives, it's hard to get people to participate. It's just not fun entering information about your family into a site, which explains my disappointing 60 entries.

Several months ago I created a family group on Facebook, and that group has almost reached 200 people as I write this entry. Having a hip web app within Facebook to gather this data would be so much more effective.

There are now several Web 2.0 startups targeting this market: Geni, MyHeritage, and Kindo, which MyHeritage recently swallowed.

Even though MyHeritage has more users and traffic, Geni has the Facebook application, which would make it really easy to send an invite to the group.

The only issue remaining, of course, is privacy: how do these companies deal with the privacy of all this data?


RedHat Local Update Server – Current

Managing more than 40 Red Hat servers became a hassle, and now an expensive one, with Red Hat charging obscene amounts of money for its Red Hat Network (RHN) update service. Open source to the rescue! I found a project called Current that lets you run a local update server that connects to Red Hat to download updates and then hosts those updates for all your other servers. You pay for only one subscription (the source is available for free if you want to compile it yourself) and you save bandwidth.


The whole process was rather confusing, so I decided to put together something on it. The central server and the client are running Red Hat Enterprise Linux AS 4, and the version of Current used was 1.7.2.

Installation Server-side:

Run: rpm -ivh current-1.7.2-1.noarch.rpm –> This installs the Current RPM.

Edit the /etc/current/current.conf file to your liking, making sure you have a directory where the database will be held.

Run: cinstall create_apache_config –> This will create configuration files in the /etc/httpd directory.

Run: cinstall create_certificate –> This will create three files in the /etc/current directory. server.crt needs to be moved to /etc/httpd/conf/ssl.crt/, and server.key to /etc/httpd/conf/ssl.key/. The last file, RHNS-CA-CERT, is moved to /usr/share/rhn and renamed CURRENT-CA-CERT.
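Those three file moves can be scripted; here is a sketch, wrapped in a function so the destination root can be pointed somewhere other than / for a dry run (on a real Current server you would call it with /, as root).

```shell
# Move the files produced by `cinstall create_certificate` into place,
# renaming RHNS-CA-CERT to CURRENT-CA-CERT as described above.
move_current_certs() {
    root="$1"    # "/" on a real Current server
    mkdir -p "$root/etc/httpd/conf/ssl.crt" \
             "$root/etc/httpd/conf/ssl.key" \
             "$root/usr/share/rhn"
    mv "$root/etc/current/server.crt" "$root/etc/httpd/conf/ssl.crt/"
    mv "$root/etc/current/server.key" "$root/etc/httpd/conf/ssl.key/"
    mv "$root/etc/current/RHNS-CA-CERT" "$root/usr/share/rhn/CURRENT-CA-CERT"
}
```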

Run: cinstall initdb –> This sets up the database schema. Restart Apache and Current should be operational. If you face any problems up to this point, you might need to check for Python or database updates.

Run: cadmin create_channel -r 'release' -a 'arch' -l 'label' -n 'name' –> This will create a channel for you. (Example: cadmin create_channel -r 4AS -a i386 -l 4AS -n i386-redhat-linux)

Run: cadmin add_dir -l 'label' -d 'dir' –> Adds a directory to the channel specified by label. (Example: cadmin add_dir -l 4AS -d /var/spool/up2date)

At this point you should have the local update server pulling updates from RHN and keeping copies of the rpms in /var/spool/up2date. This can be configured by running up2date --configure.

Installation Client-side:

Copy the certificate file CURRENT-CA-CERT to /usr/share/rhn
Edit the file /etc/sysconfig/rhn/up2date:

  • Change sslCACert=/usr/share/rhn/RHNS-CA-CERT
    to sslCACert=/usr/share/rhn/CURRENT-CA-CERT
  • Change serverURL=https://www.rhns.redhat.com/XMLRPC
    to serverURL=https://server.somedomain.com/XMLRPC
  • Change noSSLServerURL=http://www.rhns.redhat.com/XMLRPC
    to noSSLServerURL=http://server.somedomain.com/XMLRPC
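With many clients to repoint, the three edits to /etc/sysconfig/rhn/up2date can be applied with sed rather than by hand; a sketch, where server.somedomain.com stands in for your Current server's hostname:

```shell
# Rewrite the three up2date settings so a client pulls from the local
# Current server instead of RHN. $1 is the config file, normally
# /etc/sysconfig/rhn/up2date. Uses GNU sed's in-place (-i) editing.
point_client_at_current() {
    cfg="$1"
    sed -i \
        -e 's|^sslCACert=.*|sslCACert=/usr/share/rhn/CURRENT-CA-CERT|' \
        -e 's|^serverURL=.*|serverURL=https://server.somedomain.com/XMLRPC|' \
        -e 's|^noSSLServerURL=.*|noSSLServerURL=http://server.somedomain.com/XMLRPC|' \
        "$cfg"
}
```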

A cron job needs to be set up to run on a daily basis, checking for updates and installing them (excluding kernel updates).
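One way to sketch such a job, assuming the stock up2date client at /usr/sbin/up2date (kernel packages can be excluded via the pkgSkipList setting in /etc/sysconfig/rhn/up2date):

```shell
# Hypothetical /etc/crontab entry: apply available updates nightly at 3 a.m.
# --nox runs without the X interface; -u updates all outdated packages.
0 3 * * * root /usr/sbin/up2date --nox -u >> /var/log/up2date-cron.log 2>&1
```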