Monday 28 February 2011

The Japanese Congress and the Global Health Forum

This week, I traveled to Japan as part of a 'US-Japan health care policy dialogue', a partnership between the Center for Strategic and International Studies (CSIS), a Washington-based foreign policy institute, and the Health and Global Policy Institute (HGPI), based in Tokyo.  This collaboration between American and Japanese experts focused on critical areas of innovation and reform in the health sectors of both Japan and the United States - initially payment systems and healthcare IT.  Over the next 6 months, we'll complete an analysis with actionable policy recommendations.

As part of the effort, I provided testimony to the Japanese Congress (Diet) and joined an all-day Global Health Forum organized by HGPI, a leading Japanese think tank.

The Congressional experience was interesting.  Japan has the longest life expectancy of any country in the world, comprehensive healthcare coverage for all citizens, and very low healthcare costs - less than half the US expenditure per person per year.   It's challenging to highlight lessons learned from the US, which has highly variable quality, high cost, and 40 million uninsured.

Luckily, the Japanese agreed that healthcare IT should be embraced for quality/safety/efficiency, cost reduction, and job creation.

I described the US Healthcare IT program as guided by 5 goals:

* Improve quality, safety, and efficiency, and reduce health disparities
* Engage patients and families in their health care
* Improve care coordination
* Improve population and public health
* Ensure adequate privacy and security protections for personal health information

achieved with 5 tactics:

* Policy (Health Information Technology Policy Committee)
* Certification and Standards (Health Information Technology Standards Committee)
* Privacy and Security Tiger Team
* Regional Extension Centers and Health Information Exchanges ($2 billion)
* Incentives to adopt and achieve “Meaningful Use of Electronic Health Records” ($21 billion)

The Japanese legislators asked great questions about the role of genomics, the role of telemedicine, and the potential for job creation.   I remain optimistic that the Japanese will consider their own healthcare IT stimulus program.

The all-day Global Health Forum included several key points:

In Japan over the past 50 years, the economy has shifted from agricultural to industrial, from rural to urban, and from close-knit communities to often impersonal cities without the support systems of family and friends.   Many Japanese die alone and do not have the eldercare they need.

25% of Japanese are over 65, and over the next 50 years the problem will get worse, such that 2 working-age individuals will be paying for the care of 1 retirement-age individual.   The Japanese birth rate is 1.3, so the Japanese population will fall rapidly over the next 50 years, reducing the workforce and tax base.   Immigration is very limited in Japan, so a shrinking Japanese population and limited foreign workers will result in a crisis of public funding for healthcare.    The Japanese will try to balance cost, quality, and healthcare access with available funds, but even now there is a gap between the funds received from workers and the funds paid out for the care of the elderly.

IT can provide some mitigation of the problem.   Japan has one of the best wired and wireless networks in the world.   These can be leveraged to create virtual communities/social networks of carers as well as support homecare including telemedicine and remote monitoring.   IT can provide data for population health and care coordination.

At present Japan has many policies that discourage the use of the public internet for healthcare, data exchange, and homecare.  Changing policy/regulation and providing incentives to move care to the home is an important next step.  Focusing on wellness and day-to-day life rather than just treatment of disease is also an important tactic.    Japan describes this as a transition from "medical policymaking" to "health policymaking".

The session on healthcare IT included my presentation, a presentation from Dr. Akiyama of Tokyo University, and a presentation from Intel.

Many of the themes in Japanese society apply to the US.   Our aging baby boomers will require more care than the Medicare system can afford.    Secretary Sebelius has said that 1/3 of US healthcare is redundant and unnecessary.      I look forward to a continued exchange of ideas between the US and Japan.  We are meeting again this July at CSIS in Washington.

Friday 25 February 2011

Cool Technology of the Week

As I prepared for my trip to Japan this week, I had the opportunity to discuss Japan's overall IT strategy with several corporate, government, and academic leaders.

The three major goals include:

1.  Providing enhanced IT services to citizens
2.  Using IT to enhance communities
3.  Creating new markets and businesses

National healthcare IT goals include creating a national framework supporting personal health records,  enhancing telemedicine capabilities, and building population health measurement resources to enhance quality, safety, and efficiency.

Here's a full text report and an executive summary.   It's great reading and provides powerful insight into the elements of IT and Healthcare IT that are important to Japanese Society.

To me, supplementing our own policy and technology thinking by reading the plans from other countries is always very cool.

Thursday 24 February 2011

Regression to the Mean

You may have heard about the Sports Illustrated Effect, the notion that people who appear on the cover of the magazine are likely to experience bad luck, failure, or a career spiral.

Over the 30 years of my own professional life, I've watched many colleagues become famous, receive significant publicity, then fail to live up to the impossible expectations implied by their fame.   They regress to the mean.  Nature seems to favor symmetry.   Things that rise slowly tend to decline slowly.  Things that rise rapidly tend to drop rapidly.

Fame is usually a consequence (good or bad) of invention, innovation and accomplishment.   Fame itself is generally not what motivates a person to accomplish their feats.   An Olympic athlete is usually inspired because of a highly competitive spirit.    An inventor is usually inspired because he/she believes there is a better way.    Fame that is the consequence of a feat can affect future behavior.    It can become an intoxicant and motivate someone to strive for accomplishments that keep the fame coming.

I've thought about my own brushes with fame.

When I was 18 and started at Stanford, I realized that my scholarships would only cover the first year of tuition.   I visited the Stanford Law library, read the US tax code, and wrote tax-calculation software for the Kaypro, Osborne 1, and other CP/M computers.  The software, shipped from my dorm room, generated enough income to start a small company.  When the PC was introduced, we were the first to provide such software to small businesses seeking to compute their tax obligations.   By the time I was 19, I had moved into the home of Frederick Terman, former Provost of Stanford and the professor who first encouraged William Hewlett and David Packard to build audio oscillators and form a new company called HP.  The story of a 19-year-old running a software company and living in the basement of the professor behind HP was newsworthy at the time.    I did interviews with Dan Rather, Larry King, and NHK TV Japan.

The company grew during my medical and graduate school years, but as technology evolved it did not innovate to take advantage of new platforms, graphical user interfaces, or emerging networks.   I sold the company when I began my residency.   It eventually closed.

By the time it closed, I was learning to build clinical systems and worked during residency to develop a hospital-wide knowledge base for policies/procedures/protocols, an online medical record, a quality control system, and several systems for medical education.    I achieved local fame when the County of Los Angeles named me County Employee of the Month, the first time the award was given to a physician.

I left residency and began practice at Beth Israel Hospital while doing postdoctoral work at MIT, writing a thesis about using the web to securely exchange medical records.   In 1997, using the web was considered risky, unreliable, and insecure, but the recent merger of Beth Israel and Deaconess needed a quick win, so "CareWeb" was born.   I became CIO.

In 1999, Dr. Tom Delbanco and others had the idea that patients should be able to access their own records electronically.   My team created Patientsite.   We were credited with inventing one of the first personal health records.

And the list goes on -  the 2002 network outage, early regional healthcare information exchange, harmonizing standards, creating a private cloud for health care records, and achieving hospital certification in the meaningful use process.

The interesting conclusion of all of this is that in every case, the fame was temporary, and very soon followed by regression to the mean - a stellar performance or innovation became typical/average/mundane.

It's nearly impossible to remain at the front of the race forever - eventually someone stronger, faster, or more nimble will displace you.

In my case, I stopped thinking about my own reputation and fame around 1998, recognizing that every episode of fame is followed by a decline into anonymity - the Sports Illustrated effect.   What's lasting are great organizations and teams that constantly reinvent themselves - changing the race they are running.

Steve Jobs said "we're as good as our last product" and he's right.

If you focus on creating great organizations, which consistently achieve discrete episodes of fame but continuously innovate so that those episodes of rise and fall actually look like a continuous series of peaks, then you can beat regression to the mean.

The organizations in which I work will last for generations.  Their reputations transcend anything I will ever do personally.   My role is to champion, support, and publicize a few key innovations every year that will keep the organizations highly visible.  That visibility will attract smart people and retain the best employees who want to work for a place on a rising trajectory.   If I can transform the rise of fame and regression to the mean into a trend that feels like one organizational strength after another, I'll declare victory.

Wednesday 23 February 2011

A Mission to Japan

This week I'm in Tokyo meeting with Japanese government officials (legislative and executive branch) to share lessons learned from  US Healthcare IT stimulus efforts and plans for healthcare reform, including the IT implications.

Although the Japanese are much healthier than Americans, they have their own healthcare challenges.    Japan has an aging society, a low birth rate, disparities of care across income levels, and rising costs.   Since reimbursement is largely via government funded programs, the imbalance of those seeking care and those paying into the system will create a crisis of rising costs over the next decade.   Here's a news story from the Japan Times that illustrates the problem.

Hospitals are typically not connected to the internet because of privacy concerns.   Data is rarely shared with patients or among providers because of misalignment of incentives.   IT adoption among hospitals is highly variable.

My message to the Japanese is that  healthcare IT is one tactic that can help.

Using electronic health records provides a foundation for quality measurement, decision support, and exchange of healthcare data for coordination of care.     Incentives need to be realigned to focus on quality and outcomes rather than fees for services rendered.   Privacy policy needs to be formulated that protects confidentiality and patient preferences but also enables collection and exchange of data that fosters wellness by encouraging the right care at the right time.    The excellent work in error reduction that has permeated Japanese industry needs to be applied to healthcare.    At present, Lean/Six Sigma approaches have not been applied widely to hospital care.

A healthcare IT stimulus program for Japan, incorporating lessons learned from ARRA/HITECH, is a good first step.   By deploying technology and creating policy in parallel, the Japanese can innovate in healthcare, reducing costs and improving quality.

Tuesday 22 February 2011

The Current State of US Health Information Exchange

As Massachusetts formulates its health information exchange goals, priorities, and business models, we've been curious about plans in other states.   We've reviewed case studies, blogs, and academic papers.   However, there has not been a collected summary of the national HIE experience we could reference.

That resource is now available.

ONC released 3 valuable documents that I highly recommend to all HIE stakeholders.

A summary of the current state of HIEs in the US.

A detailed description of the models/approaches used.

Case studies of each model.

As states implement live data exchange in support of Meaningful Use stages 1, 2, and 3 over the next 5 years, there will be remarkable lessons learned by comparing the relative success of these models.  It's unlikely that one size will fit all, but successes and failures will lead us to a parsimonious set of models that sustainably connects every provider, patient, and payer in the country.

Monday 21 February 2011

The February HIT Standards Committee meeting

The February HIT Standards Committee meeting included an overview of our work assignments from the HIT Policy Committee, the kickoff of new Quality Workgroup initiatives, a preview of upcoming medical device hearings, an update on the Direct Project, and a rich discussion of the Standards and Interoperability Framework (S&I Framework) including what it is and what it isn't.

The meeting began with an update from Dixie Baker about the charge from the HIT Policy Committee to assist with digital certificate standardization.    In general, the role of the HIT Standards Committee in the S&I Framework is to specify the desirable characteristics of harmonized standards, do environmental scans of existing standards to provide feedback on harmonization work,  and evaluate work products of the S&I Framework, such as the Direct Project.     As a next step, Dixie's Privacy and Security Workgroup will specify the desirable characteristics of X.509 certificates that are needed for the Direct project and the Nationwide Health Information Network.

The Privacy and Security Workgroup has also been charged by the Policy Committee to assist with Provider Directory standards.  Walter Suarez will lead that initiative, doing an environmental scan of existing approaches (HL7, IHE, OMG, LDAP, state HIEs, commercial solutions) and developing a list of desirable characteristics as input to the S&I Framework process.

Thomas Tsang from ONC provided this overview of the work on quality measures to be done in support of Meaningful Use Stage 2.   The Quality Workgroup will be assigned this work.  We'll name a new workgroup chair to guide the process and add additional experts to the workgroup.   As part of quality measure development, we will evaluate the burden that capturing quality data imposes on providers, workflow, and software implementers.  Exclusionary criteria that have little impact on measure performance can be especially burdensome.   The Standards Committee made a consensus statement that exclusionary criteria should be optional, implemented at the discretion of provider organizations if they feel such criteria are significant to their measure computations.   In the case of BIDMC, almost all exclusionary criteria create burden without benefit, and we would elect not to include them in our calculations.
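
To make the exclusion burden concrete, here's a minimal sketch of a proportion measure computed with and without exclusions (the logic and data are invented for illustration, not any official eMeasure):

```python
# Hypothetical illustration of a quality measure with optional exclusions.
# Not official eMeasure logic; field names are invented for clarity.

def measure_rate(patients, in_denominator, in_numerator, is_excluded,
                 apply_exclusions=True):
    """Compute a simple proportion measure.

    apply_exclusions=False models the consensus position: a provider
    may skip exclusion logic if it doesn't materially change the result.
    """
    denominator = [p for p in patients if in_denominator(p)]
    if apply_exclusions:
        denominator = [p for p in denominator if not is_excluded(p)]
    numerator = [p for p in denominator if in_numerator(p)]
    return len(numerator) / len(denominator) if denominator else None

# Example: diabetics with HbA1c under control, excluding hospice patients.
patients = [
    {"diabetic": True,  "a1c_controlled": True,  "hospice": False},
    {"diabetic": True,  "a1c_controlled": False, "hospice": True},
    {"diabetic": False, "a1c_controlled": False, "hospice": False},
]
with_exclusions = measure_rate(
    patients, lambda p: p["diabetic"], lambda p: p["a1c_controlled"],
    lambda p: p["hospice"])
without_exclusions = measure_rate(
    patients, lambda p: p["diabetic"], lambda p: p["a1c_controlled"],
    lambda p: p["hospice"], apply_exclusions=False)
print(with_exclusions, without_exclusions)  # 1.0 vs 0.5 - exclusions can matter
```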

Liz Johnson described the March 28 Clinical Operations Workgroup Medical Device Hearing, which will include a patient/consumer panel, a provider panel, an Interoperability/Data Integration panel, a Data Accuracy/Integrity panel, a Device Security/Data Security panel, and a Universal Device Identification panel.   The Clinical Operations Workgroup will also take on the charge from the Policy Committee to assist with demographic code sets and an evaluation of patient matching strategies.

Arien Malec provided an update on Direct, including a list of the existing live installations (here's an overview of BIDMC's Direct experience).  He'll return at the March 29 Standards Committee meeting to describe lessons learned from production implementations.

Doug Fridsma provided an update on the S&I Framework, including a rich discussion of several concerns about it.

The Standards Committee members recommended:

*incorporating experienced experts and their lessons learned into the process

*involving the Standards Committee at the beginning of the process to specify the desirable characteristics of harmonized standards for each project.   Since the first 3 S&I framework projects are related to clinical summaries, labs, and transfers of care, the Clinical Operations Workgroup will be the initial liaison to the S&I process.

*performing a mid project review via the Standards Committee to ensure the work is on track and achieving the desired outcome

*evaluating the deliverables at the end of the process via the Standards Committee to determine if goals have been met

*leveraging the NIEM process without adopting NIEM XML or discarding current standards that are already part of meaningful use stage 1

*engaging the Implementation Workgroup to evaluate NIST test scripts and gather feedback from early pilots to reduce the certification burden

*incorporating vocabularies and code sets recommended by the Standards Committee, leveraging work done to date instead of reinventing what is already in progress

Doug's slides support all of these points, promising to involve the HIT Standards Committee in the S&I Framework to a  much greater extent than in the past.

We summarized the day with plans for the next meeting and a recap of charges to each workgroup:

Whole Committee
  On March 29 we'll review the timeline and milestones along the path to the Certification and Standards NPRM for Stage 2 to develop a project plan for April to October
  On March 29, we'll do a final review of the Direct project based on live implementations

Quality Workgroup
  Name a new chair
  Provide eMeasures oversight
  Educate the entire Committee about the information models used to generate quality measures

Privacy/Security Workgroup
  Policy Committee's Certificate charge
  Policy Committee's Provider Directory charge

Clinical Operations Workgroup
   Device hearing
   Policy Committee's Patient matching/demographics code set charge
   Liaison to S&I Framework for Lab, Transfers of Care, and CDA Cleanup efforts

Implementation Workgroup
   Liaison to NIST for test script review and refinement

I look forward to the work ahead!

Friday 18 February 2011

Cool Technology of the Week

Last week, I wrote about the Direct Project and Patient Engagement.

This week, BIDMC implemented it.

While I was in Washington, co-chairing the HIT Standards Committee meeting, BIDMC engineers installed the open source Direct Gateway inside the BIDMC firewall.     They worked with Healthvault engineers to exchange certificates so that the digital signing and encryption aspects of Direct's S/MIME implementation would guarantee data security and integrity.

BIDMC engineers then sent my Continuity of Care Record and Continuity of Care Document via the Direct gateway to my secure health email address - jhalamka@direct.healthvault.com

A few seconds later, I received a notification on my personal gmail account:

"New health information is available

While you can view this document at any time, you can't work with the information in it until you add each item into the right place in John Halamka's HealthVault record. This optional step makes the information more easily available to you and any health tools you use.

See the information you can add to keep John Halamka's record up-to-date."

I clicked on the URL embedded in the email, logged into Healthvault, viewed the incoming CCR and CCD, then incorporated the records as structured data into my Healthvault account.

The resulting data is displayed in "data atomic form" as suggested by the PCAST report.   The screenshot above illustrates my data sent from BIDMC to Healthvault via Direct.

A one day implementation of an open source gateway that securely sends patient data (mine) to a PHR using industry proven S/MIME standards, creating a personally controlled health record that I can export and reuse per my privacy preferences.

That's cool!

Thursday 17 February 2011

The PCAST Hearing

On February 15 and 16, the HIT Policy Committee and the HIT Standards Committee convened to hear testimony about the President's Council of Advisors on Science and Technology HIT Report.

The hearings consisted of 5 panels.   Here are the major themes.

Panel 1 focused on Health Information Exchange and Healthcare Stakeholders

Carol Diamond, MD, Markle Foundation
J. Marc Overhage, MD, Indiana HIE Organization
Art Glasgow, Vice President & Chief Technology Officer, Ingenix

* Trust is more complex than consent and cannot be achieved by technology alone
* Data source systems are frequently not able to meet reasonable service levels for response time and data persistence
* Data source organizations need assistance (strategy, policy, when to attach standardized vocabularies to data) in normalizing data
* Need to balance mobilizing data versus losing context

Panel 2 focused on Patients / Consumer / Privacy Advocates

Donna Cryer, JD, CryerHealth Patient-Centric Solutions
Deborah Peel, MD, Patient Privacy Rights
Joyce Dubow, AARP
Lee Tien, Senior Staff Attorney, Electronic Frontier Foundation

* Consent is essential but not sufficient.  PCAST's heavy reliance on consent to achieve adequate privacy is a concern
* First goal of data use should be for treatment of patients and not for secondary uses
* Privacy preferences must be dynamic based on segmentation of data
* Must do proof-of-concept pilots of DEAS and privacy
* PHRs can play a role in patients' ability to express granular privacy preferences
* Concern about adequacy of de-identification
* Many privacy issues not discussed during panel

Panel 3 focused on Population Health
Richard Platt, MD, Harvard Medical School, Distributed Health Data Network
Joyce C. Niland, Ph.D., Associate Director & Chair of Information Sciences, City of Hope

* Population and clinical research require persistent record sets, curated for the anticipated use
- Observational data is best suited for hypothesis generation
- Correct interpretation requires the participation of the originator; semantic standards will decrease but not eliminate this dependence
- Research data models reflect study design, not data characteristics
- PCAST does not preclude and can support distributed data management
* Population studies require identification of the population (denominator) and the intervention sub-population (numerator)
- Granular consent and opt out by data suppliers could be problematic
- Policies must support continued public health use of data
* De-identification is problematic

Panel 4 focused on Providers and Hospitals

Sarah Chouinard, MD, Medical Director, Primary Care Systems, Inc. and Community Health Network of West Virginia
John E. Mattison, MD, Kaiser Permanente
Scott Whyte, Catholic Healthcare West, provider using middleware
Kevin Larsen, MD, CMIO, Hennepin County Hospital
Theresa Cullen, MD, Indian Health Service, HHS

*PCAST Timeline too aggressive to execute
*Privacy tags may hinder normal institutional use of data
*Propagation/redaction of inaccurate data is a concern
*Middleware may bridge legacy systems to PCAST vision but has limitations
*Patient matching is problematic
*Novel PHR use may further spur HIT adoption

Panel 5 focused on Technology implications of the report
Michael Stearns, MD, CEO, e-MDs, Small EHR Vendor
Hans J. Buitendik, M.Sc., HS Standards & Regulations Manager, Siemens Healthcare
John Melski,  Marshfield Clinic, homegrown EHR
Edmund Billings, Medsphere

*It is important to maintain the context of a clinical encounter and to preserve the meaning when the data is reused for purposes other than as originally intended.
*Capture structured data with appropriate granularity and controlled terminology.   A "data atom" should be the amount of data that makes sense for the particular use intended.
*Separate the syntax (the container used to send data) from semantics (the ontologies and vocabularies).   Admittedly, in healthcare summary standards, syntax has been driven by semantics, so this separation would require careful thought.
-Syntax is the study of the principles and rules for constructing sentences in natural languages.
-Semantics is the study of meaning. It typically focuses on the relation between signifiers, such as words, phrases, signs and symbols, and what they stand for.
-Information models or relationship types provide frameworks to maintain context.  Explicit representation of context must be integrated into an evolving Universal Exchange Language and may require specification of an information model.
*Evaluate the burden, timeframe, and priority in the context of existing Meaningful Use and ICD-10/5010 projects.
*Simply exchanging data does not necessarily lead to useful and accurate data.  We need to know how the data was captured, for what purpose, and by whom.
*Open source solutions should be considered to drive low-cost data exchange
*Use existing profiles/technologies and middleware to meet PCAST data exchange goals. We should not rip and replace existing applications and standards.

We discussed one strawman idea for incorporating PCAST ideas into Stage 2 and Stage 3 of Meaningful Use.

Given that the Direct project provides a means to push data securely using secure email standards, require that EHRs push immunization data in an electronic envelope containing metadata (we'll call that envelope the universal exchange language) to state and local immunization registries as part of Meaningful Use Stage 2.   This will implement the Universal Exchange Language portion of the PCAST report.
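
To make the strawman concrete, here is one hypothetical shape for such an envelope - a Direct message whose headers carry provenance and patient-matching metadata around a standard immunization payload. Every header name below is invented for illustration; defining the actual universal exchange language metadata set is exactly the work that remains:

```python
# Hypothetical sketch of a metadata "envelope" around an immunization
# record sent over Direct. All X- header names are invented; no
# universal exchange language metadata set has been specified yet.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "ehr@direct.example-hospital.org"      # hypothetical address
msg["To"] = "intake@direct.state-registry.example"   # hypothetical address
msg["Subject"] = "Immunization update"
msg["X-Patient-DOB"] = "1962-05-23"                  # patient-matching metadata
msg["X-Patient-ZIP"] = "02481"
msg["X-Data-Provenance"] = "BIDMC EHR, clinician-entered"
msg["X-Privacy-Flags"] = "no-redisclosure"           # granular privacy metadata

msg.set_content("Immunization record attached.")
with open("immunization_ccd.xml", "rb") as f:        # hypothetical file
    msg.add_attachment(f.read(), maintype="application",
                       subtype="xml", filename="immunization_ccd.xml")
# A Direct gateway would then sign and encrypt this message with S/MIME.
```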

ONC should begin work on connecting state-level immunization registries with a person-identified index and privacy controls.  This will implement the Data Element Access Services (DEAS) portion of the PCAST report.

The DEAS will require significant additional policy and technology work that will not be ready by Stage 2.   Thus, by Stage 3 require that EHRs be able to query a DEAS to retrieve immunization data at the point of care so that clinicians can deliver the right immunizations at the right time to the right patients based on a nationwide federated network of immunization exchange.  It's good for patients, good for clinicians, good for public health, and does not raise too many privacy concerns.   Of course we should pilot granular privacy controls enabling individuals to control the flow of immunization information per their privacy preferences.

We'll have several additional meetings before the final workgroup report is issued.    I believe we're close to achieving consensus on the major concerns and next steps as we offer ONC options for incorporating the spirit of PCAST into their work.

Wednesday 16 February 2011

Securing your iPad and iPhone 4

I'm often asked how IT departments should advise users to secure their iPads and iPhone 4s.

Here's the process suggested by my security team:

1. Make sure you're running the latest iOS version (4.2.1 currently)
2. Download "Find My iPhone" (free app) from the Apple App Store. Log in or set up a new Mobile Me account and add the iPad to be tracked. Also try it out from a desktop to make sure you can (as a test) send a message to the device.
3. Make sure the iPad autolocks, requires a long passcode and erases data after 10 failed passcode attempts.
    In Settings->General, configure:
    a. Auto-Lock: set to something short, like 2-5 minutes (NOT "Never")
    b. Passcode Lock:
        1. Turn Passcode On
        2. Require Passcode: Immediately
        3. Simple Passcode: Off (then set a long passcode)
        4. Picture Frame: Off
        5. Erase Data: On

If the iPad is stolen, was locked at the time, and the thief does not have unencrypted access to any other device that had previously synced with the iPad (a Mac/PC), the data can be considered "safe".   The user should use "Find My iPhone" to issue a remote wipe as soon as possible.  This will of course work better over 3G, but should still be done if it's a wifi-only model.

They should also change any application or institutional passwords that may have been cached on their mobile device.

This will protect against likely attacks in the near-term.  That is, if someone finds your iPad and taps around looking for emails, pictures, etc., they can't get in.  If they hook it up to a desktop, they won't be able to read anything on the filesystem.

This method should meet safe harbor standards, as it includes encryption, follows "best practice" guidelines, and could be considered reasonable.

A few things to be aware of:

The certificates necessary to bypass the passcode screen are saved on your computer when you sync the iPad.

The hardware encryption used to protect the filesystem (and the passcode) is based on an encryption key known to Apple.  They routinely unlock devices for law enforcement (with a court order).

Current acquisition guidelines for forensic examiners state that the SIM card should be immediately removed and the device placed in a Faraday bag to prevent remote wiping (iOS Forensic Analysis for iPhone, iPad and iPod touch, Sean Morrissey, Apress 2010, 978-1-4302-3342-8). Expect attackers to do the same.

Cellebrite claims to be adding support for extracting encrypted, passcode-locked images from iOS devices with their UFED Physical capture device. Details are a bit hazy on how they're actually accomplishing this, but expect others to follow suit once it's released. Expect hackers to take full advantage of this.

There are many background network based operations constantly running in iOS when the screen is off (and passcode locked). Assuming the device has not been remotely wiped, it would be possible to observe these network connections and extract usernames/passwords. This shouldn't be a problem for most institutional credentials, which require network encryption for authentication, but an observer may be able to harvest passwords to other email or social networking services.

The cab driver who found your phone/ipad probably doesn't have the hardware, technical forensic knowledge or any ability to monetize extracted data. But the guy running the data mining operation buying from him in bulk probably does.

A better protection scheme would be something that applies encryption to stored data in user space.    This is the realm of the Good Technology product, MobileIron, and others.

Tuesday 15 February 2011

PQRI XML Testing

Many folks have asked me about the process for testing PQRI XML files, since they are part of Certification and are required to be submitted to CMS in 2012 as part of Meaningful Use Stage 1.   The inspection for Certification is manual (the testing and certification bodies visually examine the files).   To my knowledge, there are no online tools available for self-validation of these files (although creating them would be a great service to the country).
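
In the meantime, basic structural self-checks are straightforward if you have the CMS-published schema locally. A minimal sketch, assuming Python with lxml; the file names here are hypothetical placeholders for your reporting year's schema and submission:

```python
# Validate a PQRI XML submission against the CMS-published XSD.
# File names are hypothetical; substitute the schema and submission
# files for your reporting year.
from lxml import etree

schema = etree.XMLSchema(etree.parse("pqri-registry-2011.xsd"))
doc = etree.parse("bidmc-pqri-submission.xml")

if schema.validate(doc):
    print("Submission is schema-valid.")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```

Schema validation only proves the file is well-formed against the XSD, of course - it says nothing about whether the measure calculations inside are correct.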

The PQRI resources I do know about are part of the 4-step qualification process for those who want to serve as qualified PQRI registries on behalf of their participants.

Here is the testing process used by the Massachusetts eHealth Collaborative, which acts as the qualified 2011 Physician Quality Reporting System registry for BIDMC's ambulatory submissions.

The Iowa Foundation for Medical Care is the CMS contractor for PQRI testing.

The Massachusetts eHealth Collaborative (MAeHC) worked with the Iowa Foundation for Medical Care (IFMC) as follows:

1.  Self-nomination followed by a phone interview
2.  Submission of measure calculation process/logic
3.  Submission of data validation strategy/plan
4.  Submission of sample XML

The fourth step required MAeHC to generate sample XML files based on the CMS specification, encrypt the files using CMS approved encryption software, and send them on a DVD-ROM via certified mail to IFMC.

Once this was done, MAeHC sought Individuals Authorized Access to CMS Computer Services (IACS) accounts to log in to the CMS quality data submission site.

In order to get IACS accounts, users must be identity-proofed by IFMC/CMS. MAeHC had to submit a list of names with complete contact information for all users who would be authorized to submit registry data to CMS.  Each of them then had to apply for an account via the CMS Application portal and wait a few days for it to be approved.

Once accounts were approved, each user had to log in to the QualityNet portal to ensure the credentials had the proper level of access.  They were then required to submit another set of test files for validation using the online utility to ensure that they complied with any changes that had been made in the specifications.

Here's a complete overview of all the CMS requirements for qualification as a PQRI registry.

That's the process.  I hope it helps those organizations seeking to serve as registries submitting PQRI data on behalf of participants to CMS.

Monday 14 February 2011

Detailed Clinical Models

As the PCAST Workgroup ponders the meaning of a Universal Exchange Language and Data Element Access Services (DEAS), it is exploring what it means to exchange data at the "atomic", "molecular", and document level.   See Wes Rishel's excellent blog defining these terms.   For a sample of my medical record using one definition of an atomic form, see this Microsoft Healthvault screenshot.   It's clear to me that if we want to exchange structured data at a level of granularity less than an inpatient/outpatient/ED encounter, we need to think about detailed clinical models to specify the atoms.  

As I've discussed previously, it would be great if EHRs and PHRs with different internal data models could use middleware to exchange a reasonably consistent representation of a concept like "allergy" over the wire.    I think of an allergy as a substance, a precise reaction description, an onset date, a level of certainty, and an observer (a clinician saw you have a severe reaction versus your mother thought you were itchy).    PHRs often have two fields - substance and a severe/minor indicator.    Any EHR/PHR data exchange without an agreed-upon detailed clinical model will lose information.
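
To make that concrete, here is a minimal sketch of the richer allergy representation described above as a data structure. The field names and codes are my own illustration, not any official archetype or Intermountain model:

```python
# A sketch of a detailed clinical model for "allergy" - the fields a
# consistent over-the-wire representation might carry. Field names
# and codes are illustrative only.
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Certainty(Enum):
    CONFIRMED = "confirmed"    # e.g., reaction observed by a clinician
    REPORTED = "reported"      # e.g., recalled by the patient or family
    SUSPECTED = "suspected"

@dataclass
class Allergy:
    substance_code: str        # e.g., a SNOMED CT or RxNorm code
    substance_text: str
    reaction: str              # precise reaction description
    onset: date
    certainty: Certainty
    observer: str              # clinician vs. patient vs. family member

penicillin = Allergy(
    substance_code="7980",     # illustrative code, not verified
    substance_text="Penicillin",
    reaction="Anaphylaxis with airway compromise",
    onset=date(1995, 6, 1),
    certainty=Certainty.CONFIRMED,
    observer="ED physician",
)
# A two-field PHR (substance + severe/minor flag) would lose most of this.
```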

HL7's Reference Information Model (RIM) provides one approach to modeling the data captured in healthcare workflows.  However, many clinicians do not find the RIM easily understandable, since it is an abstraction (Act, ActRelationship, Participation, Roles, Entities) rather than a reflection of the way a clinician thinks about clinical documentation.   Alternatively, a detailed clinical model provides an archetype or template to define the aspects of the data we should exchange.

Stan Huff at Intermountain Healthcare has led much of the work on detailed clinical models in the US.   Here's a recent presentation describing his work.

To illustrate the way a detailed clinical model can enhance data exchange, here's a description of an allergy template based on collaborative work between Intermountain, Mayo and GE.

The Australian work on OpenEHR is another interesting approach to detailed clinical models.  It creates a clear expression of a clinical concept in a manner that can be understood both by clinical subject matter experts and technologists. For a given concept, OpenEHR specifies a SNOMED code and then builds the appropriate information structure around it.

The screen shot above illustrates the OpenEHR archetype for adverse reactions/allergies.

Other important efforts include:

William Goossen's ISO 13972 work.  He's writing a review of 6 different approaches (see the comment on Keith Boone's blog) including HL7's CDA templates, OpenEHR archetypes, and Stan Huff's detailed clinical models.  Hopefully it will be published soon.

The UK National Health Service Connecting for Health Project's Logical Record Architecture.

The Tolven Open Source Project's Clinical Data Definitions.

It's clear to me that wide adoption of a Universal Exchange Language would be accelerated by detailed clinical models.     This is a topic to watch closely.

Friday 11 February 2011

Cool Technology of the Week

In New England, we all have snow on our minds.

While shoveling the 7 feet of snow we've had in Wellesley, I've thought there must be an easier way - why not a small portable jet engine to just blast your driveway clear?

Although not yet practical for domestic use, I found a cool technology used by our railroads in the northeast - the jet engine snow blower.

The Snow Jet is a snow blower made from a jet engine mounted on a railroad car.  The Metro-North railroad uses 6 of these (pictured above) to clear the snow on commuter rail lines around New York.

Starting the engine requires finesse to prevent excess fuel in the nozzle from erupting into a twenty-foot flame. The cab controls include movement up/down and right/left.   I'm told that the engines make so much noise that snow clearing near residential areas is banned from late evening until the following morning.

There is a certain amount of danger in operating jet engine powered snowblowers. Along with melting and blowing snow, they will also blow loose debris out of the tracks, and there's nothing like flying steel and concrete to ruin your day.

A jet engine for clearing snow - that's cool!

Thursday 10 February 2011

Choosing a Great Cognac

Recently, while watching the Granada Television Sherlock Holmes series with my family, we concluded that the English of the Victorian era used brandy/cognac as a cure-all for a multitude of emotional and physical ailments.   This led me to ask - how do you choose a great brandy/cognac?

Today, domestic brandy is not a trendy drink in the US.   Chances are that your local liquor store will have a wall of single malt scotch, interesting whiskeys, a few bottles of cognac, and a single type of inexpensive brandy.    Cognacs have become increasingly popular because of their association with rap music and rap artists.

What is Cognac?

It's a beverage made from distilling wine made in the Cognac region of France, 250 miles southwest of Paris.  Cognac starts as a dry, thin, acidic wine made from Ugni Blanc (also called Trebbiano) grapes.   It's double distilled in copper pot stills, then aged in oak, where the alcohol and water evaporate over time, concentrating the flavor and reducing the alcohol content from 70% to 40%.

Although you may hear the term Champagne Cognac, it has nothing to do with Champagne, a region 100 miles east of Paris.   "Champagne" derives from the Roman "Campania" meaning "plain".  The plains of Champagne and Cognac do have the same chalky soil.

There are 4 major producers of cognac - Courvoisier, Hennessy, Martell, and Rémy Martin, although several smaller boutique brands are available.

At bottling, cognacs are blended to incorporate the best flavors from various batches/ages; cognac is graded based on the youngest component in the blend.

Very Special (VS) - at least 2 years in oak
Very Superior Old Pale (VSOP) - at least 4 years in the cask
Extra Old (XO) - at least 6 years in the cask but as of 2016, this will change to 10 years.

The grape-growing region of Cognac is divided into 6 areas, the best of which are the Champagne appellations:
Grande Champagne
Petite Champagne
Borderies
Fins Bois
Bon Bois
Bois Ordinaires

I do not choose any beverage based on reputation or price; I seek value - quality divided by price.

My recommendation is to focus on VS or VSOP, because XO is generally very expensive.   From my limited experience, I believe the best major brand VS is Courvoisier and the best major brand VSOP is Rémy Martin.

On a cold winter's night, curl up with a good Sherlock Holmes video or book and try a snifter of one yourself!

Wednesday 9 February 2011

The Direct Project and Patient Engagement

The proposed Stage 2 Meaningful Use Recommendations include numerous patient engagement features: patient communication preference, electronic self management tools, EHR interfaces to PHRs, patient reporting of care experiences online, and patient generated data incorporation into EHRs.

I've long felt that a barrier to patient engagement is the lack of a common approach for transferring data between EHRs and PHRs and for sending reminders/alerts/communications to patients.

Patients lack a Health URL or Health Email Address which would enable any EHR or HIE to route data securely among providers and patients.

There's a solution in sight, enabled by the Direct project.

Last week, Microsoft announced that it will provide a health email address (your_name@direct.healthvault.com) to every user of HealthVault. They've also provided an innovative way to sign up users who do not yet have a HealthVault account - just send an email to newuser@direct.healthvault.com with a subject line containing the patient's existing email address. The patient will be sent instructions to set up an account and receive their secure health message.
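
Here's a minimal sketch of how a registration system could trigger that enrollment. The From address and internal relay host are hypothetical, and the message would still need to flow through a Direct-capable gateway that applies the S/MIME signing and encryption:

```python
# Sketch: trigger HealthVault account creation for a patient who has
# only a regular email address. Per Microsoft's announcement, the
# patient's existing email goes in the subject line of a message to
# newuser@direct.healthvault.com. The SMTP relay shown is a
# hypothetical internal Direct gateway that handles S/MIME.
import smtplib
from email.message import EmailMessage

def enroll_patient(patient_email: str) -> None:
    msg = EmailMessage()
    msg["From"] = "registration@direct.example-hospital.org"  # hypothetical
    msg["To"] = "newuser@direct.healthvault.com"
    msg["Subject"] = patient_email   # patient's existing email address
    msg.set_content("Please create a HealthVault Direct address.")
    with smtplib.SMTP("direct-gateway.internal.example") as smtp:  # hypothetical
        smtp.send_message(msg)

enroll_patient("jane.doe@example.com")
```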

All of this uses the Direct S/MIME secure email approach for transport.

If Google, Dossia, and other PHR vendors support a similar Direct approach, then all we need to do to support the patient engagement aspects of Meaningful Use Stage 2 is capture each patient's secure health email address at registration or capture their regular email address and send an enrollment message to the PHR of their choice.

Instead of proprietary software development for every PHR, the Direct approach creates a single one time implementation for hospitals and EHR vendors.

Sean Nolan at Microsoft and I have been exchanging email about the implementation details. Below, he outlines the details and the options:

"1. For sending the message:

a. If you have an existing product that supports S/MIME, feel free to use it as long as it can encrypt AND sign outbound messages.  (BIDMC uses a Proofpoint appliance for email security management and it may support Direct S/MIME requirements out of the box.)

b. You can also generate the S/MIME message outside of the email system and then submit it as any other message to your existing Exchange server for delivery. You could use something like the smime utility that comes with openssl, or there are commercial components such as IP*Works S/MIME. This avoids any changes to your infrastructure and concentrates the work in the code that generates the outbound message.

c. You can install an instance of the C# or Java gateways that have been created as part of the Direct project. For outbound messaging, your message generating code could send plain-vanilla SMTP to the gateway, and it could do the sign/encrypt and forward it through your existing email system.

2. For managing certificates:

Two sides to this … your certificate (for signing the message) and ours (for encrypting it).

For encryption --- we can simply give you the HealthVault organizational public certificate to use. If you go with 1C, you can install this in the gateway software. For 1A or 1B you’ll use different approaches to storing it.

For signatures --- we’ll need a copy of your organizational public certificate, and then you’ll need to sign outbound messages with the private key. Again, for 1C above you can just add your private and public keys to the gateway; for 1A and 1B you’ll manage differently.

3. Testing:

You can self-provision HealthVault test accounts and Direct addresses here, which connects to our “pre-production environment” where all of our developers build and test code. The Healthvault staging certificates can be downloaded from here."
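
For those considering option 1b, here's a minimal sketch using the openssl smime utility Sean mentions, driven from Python. All file names and addresses are hypothetical placeholders:

```python
# Sketch of option 1b: sign, then encrypt, a message with openssl's
# smime utility, producing an S/MIME file that can be handed to an
# existing Exchange/SMTP server for delivery. File names hypothetical.
import subprocess

# Sign the outbound message with our organizational certificate.
subprocess.run(
    ["openssl", "smime", "-sign",
     "-in", "ccd_message.txt", "-text",
     "-signer", "our_cert.pem", "-inkey", "our_key.pem",
     "-out", "signed.eml"],
    check=True)

# Encrypt the signed message with the recipient's (HealthVault's)
# public certificate, adding the routing headers.
subprocess.run(
    ["openssl", "smime", "-encrypt", "-aes256",
     "-in", "signed.eml",
     "-from", "ehr@direct.example-hospital.org",
     "-to", "jhalamka@direct.healthvault.com",
     "-subject", "Continuity of Care Document",
     "-out", "encrypted.eml",
     "healthvault_cert.pem"],
    check=True)
# encrypted.eml can now be submitted to the mail server like any
# other outbound message.
```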


If Direct truly creates a single mechanism for healthcare stakeholders to exchange content - summaries, reminders, homecare device data, etc. - then we'll finally get enough endpoints connected to demonstrate the value of HIE. With Meaningful Use Stage 2 as a motivator and HIE funding as a catalyst, let's hope the country can converge on a common transport approach.

Tuesday 8 February 2011

A Multi-Layered Defense for Web Applications

The internet can be a swamp of hackers, crackers, and hucksters attacking your systems for fun, profit, and fraud.  Defending your data and applications against this onslaught is a cold war, requiring constant escalation of new techniques against an ever-increasing offense.

Clinicians are mobile people.  They work in ambulatory offices, hospitals, skilled nursing facilities, on the road, and at home.   They have desktops, laptops, tablets, iPhones and iPads.  Ideally their applications should run everywhere on everything.   That's the reason we've embraced the web for all our built and bought applications.   Protecting these web applications from the evils of the internet is a challenge.

Five years ago all of our externally facing web sites were housed within the data center and made available via network address translation (NAT)  through an opening in the firewall.   We performed periodic penetration testing of our sites.  Two years ago, we installed a Web Application Firewall (WAF) and proxy system.    We are now in the process of migrating all of our web applications from NAT/firewall accessibility to WAF/Proxy accessibility.

We have a few hundred externally facing web sites.  From a security view there are only two types: those that provide access to protected health information and those that do not.   Fortunately, more are in the latter category than the former.

One of the major motivations for creating a multi-layered defense was the realization that many vendor products are vulnerable and even when problems are identified, vendors can be slow to correct defects.   We need  "zero day protection" to secure purchased applications against evolving threats.

A multi-layered defense should include the following technologies:

1.  Filter out basic network probes at the border router such as traffic on unused ports

2.  Use Intrusion Prevention Systems (IPS) to block common attacks such as SQL injection and cross-site scripting. We block over 10,000 such attacks per day.   You could implement multiple IPSs from different vendors to create a suite of features, including URL filtering, which prevents internal users from accessing known malware sites.

3.  A classic firewall and Demilitarized Zone (DMZ)  to limit the "attack surface".

Policies and procedures are an important aspect of maintaining a secure environment.   When a request is made to host a new application, we start with a Nessus vulnerability scan.

Applications must pass the scan before we will consider hosting them.   We built a simple online request form to both track these requests and keep the data in a SQL database.    This provides the data source for an automated re-scan of each system.
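
The re-scan automation can be very simple. Here's a minimal sketch, assuming a hypothetical table of approved hosts in the request-tracking database, with nmap's vulnerability scripts standing in for whatever scanner you license:

```python
# Sketch of a periodic re-scan driven by the request-tracking database.
# Database, table, and scanner choice are illustrative; any
# command-line scanner slots in here in place of nmap.
import sqlite3
import subprocess

conn = sqlite3.connect("hosting_requests.db")   # hypothetical database
hosts = [row[0] for row in
         conn.execute("SELECT hostname FROM approved_sites")]

for host in hosts:
    # Service/version detection plus nmap's 'vuln' script category.
    subprocess.run(
        ["nmap", "-sV", "--script", "vuln",
         "-oN", f"scan_{host}.txt", host],
        check=False)   # keep scanning remaining hosts on failure
```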

Penetration testing of internally written applications is a bit more valuable because they are easier to update/correct based on the findings of penetration tests.

One caveat: the quality of penetration testing is highly variable.    When we hire firms to attack our applications, we often get a report filled with theoretical risks that are not especially helpful, e.g., "if your web server were accidentally configured to accept HTTP connections instead of forced HTTPS connections, the application would be vulnerable."   That's true, and if a meteor struck our data center we would also have many challenges on our hands.  When choosing a penetration testing vendor, aim for one that can put its findings in a real-world context.

Thus, our mitigation strategy is to apply deep wire-based security, utilize many tools including IPS, traditional firewalls, WAF, and proxy servers, and perform periodic recurring internal scans of all externally accessible systems.

Of course, all of this takes a team of trained professionals.

I hope this is helpful for your own security planning.

Monday 7 February 2011

Volatility, Uncertainty, Complexity and Ambiguity

In the era of healthcare reform when accountable care organizations, global payments, and partial capitation are the buzzwords filling Board rooms, healthcare executives are wondering what to do next.

The answer came from Dr. Gene Lindsey, President and CEO of Atrius Health during a recent retreat.

It's about accepting and managing VUCA.

V = Volatility. The nature and dynamics of change, and the nature and speed of change forces and change catalysts.
U = Uncertainty. The lack of predictability, the prospects for surprise, and the sense of awareness and understanding of issues and events.
C = Complexity. The multiplex of forces, the confounding of issues and the chaos and confusion that surround an organization.
A = Ambiguity. The haziness of reality, the potential for misreads, and the mixed meanings of conditions; cause-and-effect confusion.

The common usage of the term VUCA began in the military in the late 1990s, but it's been applied to corporate and non-profit leadership by several authors, especially Bob Johansen, former CEO of the Institute for the Future.

I recommend two books by Johansen -  Get There Early  and Leaders Make the Future.

Johansen suggests that strong leaders turn volatility into vision, uncertainty into understanding, complexity into clarity, and ambiguity into agility.

He concludes that:
1. VUCA will get worse in the future.
2. VUCA creates both risk and opportunity.
3. Leaders must learn new skills in order to create the future.

Dr. Lindsey and I discussed these ideas and he added two of his own.

4.  Leaders need to turn ambiguity into action.  How many times have you heard "I do not have enough data to make a fully informed decision"?   Not acting makes you a target in a VUCA world.

5.  Johansen notes that the most difficult VUCA competency for the future is "commons building".  Dr. Lindsey related this to Don Berwick's concept of the medical commons.  Berwick, when he was CEO of IHI, wrote about the need for a medical commons to accelerate the Triple Aim in healthcare.  He wrote, "Rational common interests and rational individual interests are in conflict. Our failure as a nation to pursue the Triple Aim meets the criteria for what Garrett Hardin called a 'tragedy of the commons.' As in all tragedies of the commons, the great task in policy is not to claim that stakeholders are acting irrationally, but rather to change what is rational for them to do. The stakes are high. Indeed, the Holy Grail of universal coverage in the United States may remain out of reach unless, through rational collective action overriding some individual self-interest, we can reduce per capita costs."

Let's explore the issue of "commons building" with a healthcare IT example.  15% of the lab and radiology tests done in Eastern Massachusetts are redundant or unnecessary.  Ensuring all test results are available electronically among all providers (especially between competing organizations) will cost millions in EHR, HIE, and interface implementation. Thus, we'll have to spend money to reduce all our incomes.   It's the right thing to do, but the medical IT commons will be at odds with individual incentives in a fee-for-service world.   The right answer - change the incentives and pay individuals for care coordination, not for ordering more tests.

I've thought about Dr. Lindsey's comments and realized that I've had my own VUCA challenges in the past, as well as many VUCA challenges in the present.

Let's turn back the clock to 2008.  The Obama campaign suggested that EHRs and HIEs were the right thing to do.   We had all the signs that ARRA and HITECH would be coming, but large scale EHR rollouts require significant lead time.  We had to act.   BIDMC decided that Software as a Service (SaaS) EHRs were the right thing to do and created a Private Cloud.   The concept of the Private Cloud really did not exist in 2008, and we did not know enough to predict it.  We just did what we thought was right - keep all software and data on the server side rather than in the doctor's office.   Today, people look at our Community EHR SaaS model and congratulate us on our foresight to build a cloud.   I'll be honest - it was not planned or forecasted.   We just had intuition based on the market forces and technology trajectory we saw, and we guessed.   I would really like to say we built a private cloud on purpose.   It was a serendipitous guess.   In the future, there may be cloud providers that offer business associate agreements for highly reliable, cost-effective, secure EHR hosting.   We should think about migrating our private cloud to such services.

Also, 3 years ago, BIDMC decided to focus our clinical systems efforts on CPOE, medication reconciliation, HIE, quality measurement, and advanced ambulatory function instead of inpatient clinical documentation or nursing workflow.   Meaningful Use Stage 1 was a perfect reflection of what we did.   I have no influence on the Policy Committee's focus, nor did we have amazing insight.   It was a best guess.   Stage 2 is likely to include electronic medication administration records/bedside medication verification, enhanced vital signs capture, and more clinical documentation to provide data for quality measures.   We'll want to focus our future efforts there.

ICD-10 is required by 2013, new payment models based on quality and care coordination with incentives to share savings will begin in 2012, and pressure to reduce cost via guidelines/protocols/care plans will increase.    Our governance committees will have to make hard choices about what not to do in the VUCA world of the next 3 years.   Maybe the future is going to include more ambulatory and ICU care with ward care moved to home care.   We'll have to guess again where the puck is going to be.

As a leader, my time needs to be divided among Federal, State, and Local initiatives so that my governance committees, my staff, and I can make the guesses for the future.   None of us knows what healthcare reform will bring or what the reimbursement models will really be.   However, we need to act now to be ready for the next two years.   That's VUCA.

On occasion I tell my wife that someday the VUCA I face every day will get better.   She reminds me that it will only get worse.  If I'm doing my job properly, I will accept and manage the VUCA, so that my staff can focus on the work we need to do to stay on the cutting edge.

Friday 4 February 2011

Cool Technology of the Week

I'm a late adopter when it comes to camcorder technology.   Over the course of my life, I've seen VHS, 8mm, Hi-8, and MiniDV camcorders revolutionize the market, only to be replaced by completely solid state storage (SSD, Compact Flash, iPod, etc.).

Since I've only purchased 2 camcorders in my life, I've been insulated from all this change.   I had an 8mm simple camcorder (a water resistant "Sport" model) from the late 1980's and a MiniDV from the early 2000's.

My daughter lives in the Facebook generation, and she recently asked me about the best way to connect our MiniDV camcorder to her Macbook, iMovie, and YouTube.  The answer is - time to retire our tape-based camcorders and purchase an under-$200 "shoot and share" camcorder.

A quick survey of the market suggests that Cisco's Flip UltraHD/MinoHD, Kodak's Playsport/Playtouch/Zi8, and Sony's Bloggie are the market leaders.

Each of these is pretty cool - 720p or 1080p HD video in a smartphone sized package under 5 ounces for $100-$200.

After reading the reviews,  I got the sense that the Sony is the most full featured, the Kodak takes still photos, is waterproof and outdoors-friendly, and the Flip is the easiest to use.

After briefly looking at these at Best Buy, my sense was that ease of use and support was my chief requirement, so I did further research into the Flip.    A few of my colleagues in Corporate Communications/Public Affairs use Flips to create web content and seemed happy with their functionality, video quality, and value.

The UltraHD 8G (avoid the 4G, which lacks many features) and the MinoHD 8G have the same features and performance.  The only differences are that the UltraHD is a bit larger with a removable battery and tactile buttons, while the MinoHD is a bit smaller with an internal battery and capacitive (touch) buttons.

I found the user interface of the UltraHD more intuitive and liked the idea of removable/replaceable batteries.

When you purchase a Flip Ultra, you'll likely want to purchase the power supply, which charges the battery when the Flip is not connected to the computer, and a micro HDMI cable (note this is not the standard Flip HDMI cable, which is mini-HDMI).

A pocket sized, easy to use, less than $200 camcorder with excellent video quality that works well with modern video editing tools and online video posting sites.   That's cool!

Thursday 3 February 2011

Shoveling Snow

I moved to New England from California 15 years ago and have been shoveling snow from December to March ever since.  Managing snow, especially this winter with over 5 feet of snowfall in Wellesley, is more complex than it seems.

There are many different kinds of snow  - light powder, wet/heavy gluelike snow, wintry mix of rain/sleet/ice, the "crud" left by snowplows, and the corn snow leftover from freeze/thaw cycles.

I'd like to say that New Englanders have even more words for snow than the Eskimos, but the whole notion that any language has a vast number of words for snow is an urban myth.

Here's my recommendation for the equipment you need to shovel snow in New England:

1.  An Ergonomic Snow Pusher.  When snow first falls, you need a shovel to push it down the driveway.  I typically cut a path to the street and then use the pusher to move snow to the sides of the driveway.  Then I can use my scoop shovel to move it into piles.

2.  A Scoop Shovel.  A pusher is great for moving snow but not so good for picking up chunks or larger amounts of consolidated snow to create piles (actually, it looks more like a canyon at this point in Wellesley) next to the driveway.  Scoop shovels were originally invented to move grain, but they work perfectly for snow.  I highly recommend aluminum scoops because the poly shovels bend and break.

3.  An Ice Chipper/Scraper - Freeze/thaw cycles create a consolidated mixture of ice and snow that's as hard as concrete.  An ice chipper is great for breaking up the chunks as well as scraping the ice/snow that sticks to asphalt and creates a hazard.  Of course, you can salt your driveway to soften the ice before chipping it.

4.  A Spading Fork - Although it seems like an odd tool to use for snow management, a Spading Fork helps break up the large frozen piles of snow, ice, sand, and salt that the snowplows leave in your driveway.   I use a spading fork to turn the snowplow mound into smaller, manageable pieces, then the scoop shovel to move them to the piles.

5. A Sno-Broom - Using a shovel, a chipper, or a fork on your car is a really bad idea.  A Sno-Broom enables you to push ice and snow off your car without scratching the finish.  An ice scraper for the windows is also a good idea.

Some of you may be thinking that a snow blower or thrower would be a better idea.     But where's the fun in that!

Wednesday 2 February 2011

USB Modems

I was recently asked about the reliability of 4G USB Modems such as the LG VL600  or the Pantech UML290 as a replacement for Broadband, especially in areas that have intermittent cable connections or slow DSL.

I asked my staff about their experience with USB Modems in general and here is what they said:

"I have had very good luck with the Verizon 3G service.     The coverage and performance has been good but my experience is limited to the north of the city. I have not tested the 4G service yet.    I do know that all of the towers in the greater Boston area received all new equipment within the last 6 months to support the 4G and they upgraded the land line circuits to support increased demand by a minimum of 3 fold and more in some locations.    It turns out it was the schedule of the land line upgrades the controlled the 4G roll out schedule."

"I use the Verizon 3G service for remote connectivity. I have had two of the USB versions of the device and moved to the MiFi version about 9 months ago. I use it daily to work on the train and other locations for at least 2 hours a day.

I find it very useable but there are areas of low/no signal. The device works via USB or wireless and lets up to 5 users/devices connect via the 3G.

I find it most useful for heavy Internet use and remote access to my work desktop. It is not very useful for video viewing unless you cache and then play them.  It is fine for email, Internet searching and working on / moving office files.

4G is available in the Boston area but I will need to get another device to take advantage of this upgrade.  The 4G performance should be better (at least twice as fast at this time). Pricing on 4G seems the same as 3G and the USB modems  will switch to 3G if/when 4G is not available. "

There you have it - 3G and 4G USB Modems and MiFi are credible alternatives for some uses when Broadband is not available.   One of our clinician sites is in a location with very poor access to any ISP.   We are now investigating the possibility of using 4G as the internet connection for that office as part of implementing our software as a service electronic health record.

Tuesday 1 February 2011

The Safety of HIT-Assisted Care

I was recently asked by an Institute of Medicine committee to comment about the impact of healthcare information technologies (HIT) on patient safety and how to maximize the safety of HIT-assisted care.

"HIT-assisted care" means health care and services that incorporate and take advantage of health information technologies and health information exchange for the purpose of improving the processes and outcomes of health care services. HIT-assisted care includes care supported by and involving: EHRs, clinical decision support, computerized provider order entry, health information exchange, patient engagement technologies, and other health information technology used in clinical care.

There are two separate questions:
1. What technologies, properly used, improve safety?
2. Given that automation can introduce new types of errors, what can be done to ensure that HIT itself is safe?

To explore these topics, let's take a look at Health Information Exchange (HIE).  What HIE technologies improve safety and how can we ensure the technologies are safe to use?

At Beth Israel Deaconess Medical Center we exchange many types of data for care coordination, patient engagement, and population health.   Below is a detailed summary of the HIE transactions implemented in our recently certified hospital systems.

Most of these transactions are sent via the New England Healthcare Exchange Network (NEHEN)  using AES256 and SHA-1 to encrypt and hash data, ensuring the privacy and integrity of information shared among payers, providers, patients, and government in Massachusetts.
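For illustration, here's a minimal Python sketch of the encrypt-and-hash pattern described above, using the open source cryptography library.  The function name, key handling, and message framing are my own hypothetical simplifications - NEHEN's actual wire protocol is more involved.

import hashlib
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def protect_payload(payload: bytes, key: bytes):
    """Hash the payload with SHA-1 for integrity checking, then encrypt it
    with AES-256 in CBC mode for privacy.  The 32-byte key would come from
    key management between exchange participants (hypothetical here)."""
    integrity_digest = hashlib.sha1(payload).hexdigest()
    iv = os.urandom(16)                          # fresh IV per message
    padder = padding.PKCS7(128).padder()
    padded = padder.update(payload) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = encryptor.update(padded) + encryptor.finalize()
    return iv, ciphertext, integrity_digest

# Example: protect a CCD before transmission.
iv, ciphertext, digest = protect_payload(b"<ClinicalDocument>...</ClinicalDocument>", os.urandom(32))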

1.  Patient Summary Exchange for Transitions of Care
We produce a Continuity of Care Document for each patient handoff, i.e. from inpatient, ED, and outpatient (coming soon) settings to home, skilled nursing facilities, or other hospitals.  The CCD includes

Problems
Procedures
Medications
Allergies and Adverse Reactions
Results
Encounters

Safety is improved by ensuring each provider has a complete problem list, medication list, allergy list, and recent results.    Such a document is useful for medication reconciliation, drug/drug and drug/allergy decision support, and managing the entire patient by understanding all active problems.
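As a toy example of the drug/allergy checking a complete summary enables, here's a Python sketch that screens a new medication order against the allergy list from a CCD.  The cross-reaction table and function names are illustrative only - a real system uses a curated drug knowledge base.

# Illustrative only: a real system uses a curated drug knowledge base.
ALLERGY_CROSS_REACTIONS = {
    "penicillin": {"penicillin", "amoxicillin", "ampicillin"},
}

def allergy_alerts(new_drug, allergy_list):
    """Return an alert for each documented allergy the new drug conflicts with."""
    alerts = []
    for allergy in allergy_list:
        related_drugs = ALLERGY_CROSS_REACTIONS.get(allergy.lower(), {allergy.lower()})
        if new_drug.lower() in related_drugs:
            alerts.append("ALERT: %s conflicts with documented allergy to %s" % (new_drug, allergy))
    return alerts

print(allergy_alerts("Amoxicillin", ["Penicillin"]))
# ['ALERT: Amoxicillin conflicts with documented allergy to Penicillin']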

However, summaries exchanged at a point in time are just that - a summary or abstract of the lifetime record that is accurate at a point in time.   They do not provide access to the complete record such as inpatient notes, operative notes, history and physicals, and historical data such as discontinued medications or resolved problems.   Many clinicians believe that a patient summary at a point in time is good enough for transitions of care, so the risk introduced by abstracting the record into just the salient handoff details may be minimal.    A compromise may be a fresh look at what elements should be required for transitions of care.   Last week, Massachusetts was awarded an ONC challenge grant to study this question by piloting innovative additions to the standard CCD using CDA Templates.

Here's a CCD for transitions of care, displayed in human readable form via a stylesheet.
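Rendering a CCD in human readable form is straightforward; here's a sketch using Python's lxml library.  The file names are placeholders - any CDA-compatible XSLT stylesheet will do.

from lxml import etree

# File names are placeholders for an actual CCD and CDA stylesheet.
ccd_document = etree.parse("transition_of_care_ccd.xml")
stylesheet = etree.parse("cda_stylesheet.xsl")

# Compile the stylesheet and apply it to produce human-readable HTML.
transform = etree.XSLT(stylesheet)
html_result = transform(ccd_document)

with open("ccd_human_readable.html", "wb") as output:
    output.write(etree.tostring(html_result, pretty_print=True))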

2.  Patient Summary Exchange from EHRs to PHRs
We produce a Continuity of Care Record when a patient initiates a transfer of their records from our EHR to the PHR of their choice (Google or HealthVault).   The CCR includes

Demographics
Problems
Medications
Allergies
Additional Information About People and Organizations

Safety is improved by sharing data between providers and patients, making the patient the steward of their own records.   This transparency encourages a dialog about treatment plans, patient care preferences, and the accuracy of data in the medical record.

However, most commercial Personal Health Records do not provide for exchange of clinician office notes such as we've piloted in BIDMC's Patientsite OpenNotes Project, nor do they include a consistent way to map EHR data to PHR displays.  For example, BIDMC's EHR considers an allergy list entry to be the substance, the reaction, the observer (doctor, nurse, your mom), and the level of certainty.  Google considers an allergy to be the substance and a mild/severe indicator.  Thus, a transmission of an allergy "Penicillin, Hives, Doctor, Very Certain" to Google results in "Penicillin" with no other information.  Use of an agreed upon list of data elements (i.e. what constitutes an allergy list) for data exchange would resolve this problem.
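To make the lossiness concrete, here's a small Python sketch of the EHR-to-PHR allergy mapping described above.  The field names are hypothetical; the point is that three of the four EHR fields have nowhere to go.

from dataclasses import dataclass

@dataclass
class EHRAllergy:
    """A BIDMC-style allergy entry (field names are hypothetical)."""
    substance: str
    reaction: str
    observer: str     # doctor, nurse, your mom...
    certainty: str

def to_phr_allergy(entry):
    """Map to a PHR that stores only substance plus a mild/severe flag.
    Reaction, observer, and certainty are silently dropped."""
    return {"substance": entry.substance, "severity": None}

entry = EHRAllergy("Penicillin", "Hives", "Doctor", "Very Certain")
print(to_phr_allergy(entry))   # {'substance': 'Penicillin', 'severity': None}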

Here's a CCR transmitted from an EHR to a PHR, displayed in human readable form via a stylesheet.

3.  Patient Summary Exchange for Discharge Instructions
We produce a Continuity of Care Document with discharge instructions for patients via a multidisciplinary web application used by doctors, nurses, social workers, and case managers.  The CCD includes

Discharge Medications
Discharge Instructions
Final Diagnosis
Recommended Follow-up
Major Surgical or Invasive Procedures
Condition at Discharge

Safety is improved by ensuring the patient understands the next steps after they are discharged from the hospital.  Inpatient medications are reconciled with outpatient medications, dietary or activity restrictions are noted, and follow-up appointments are documented.

However, at present, Meaningful Use does not require a specific electronic format for patient discharge communications.  Patient discharge instructions are generated by humans and include a distillation of the record, not a complete copy of it.  A printed report, a PDF, or a web page all suffice.  Although we have used the Continuity of Care Document format, it is not optimized for structured discharge instructions.  CDA Templates with specific fields for the data elements most commonly used in discharge communications would likely be better.

Here's a CCD for patient discharge instructions, displayed in human readable form via a stylesheet.

4.  Patient Summary Exchange for Quality Measurement
We produce a Continuity of Care Document with key process and outcomes measures for transmission to the Massachusetts eHealth Collaborative (MAeHC)  Quality Data Warehouse.   MAeHC computes all our ambulatory PQRI measures and all our pay for performance metrics.  The CCD includes

Payers
Problems
Procedures
Results
Medications
Encounters

Safety is improved by providing our clinicians, administrators, and government agencies with the metrics needed to evaluate our process and outcomes quality.

However, quality measures require precise coding of concepts into SNOMED-CT and other vocabularies.   It is up to the clinician to translate their observations into the correct structured data and this is challenging.   Better tools to automatically map physician plain language into controlled vocabularies would help.
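Here's a deliberately naive Python sketch of the kind of mapping such tools must perform.  Real terminology services use NLP and complete SNOMED-CT releases; the lookup table and codes below are illustrative only.

from typing import Optional

# Illustrative lookup table only - not real terminology content.
SNOMED_LOOKUP = {
    "type 2 diabetes": "44054006",
    "essential hypertension": "59621000",
}

def map_to_snomed(clinician_text: str) -> Optional[str]:
    """Exact-match mapping after simple normalization."""
    return SNOMED_LOOKUP.get(clinician_text.strip().lower())

print(map_to_snomed("Type 2 Diabetes"))   # '44054006'
print(map_to_snomed("sugar diabetes"))    # None - synonyms defeat naive lookup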

Here's a CCD for quality measurement, displayed in human readable form via a stylesheet.

5.  Patient Data Exchange for Public Health Activities
We produce numerous submissions to government agencies to support population health and public health goals.   The messages are sent to public health in batch every day based on results filed into patient records.   They are exact duplicates of patient results, diagnoses, and immunization records  without any loss of completeness.

Reportable lab results are sent to the Department of Public Health and Boston Public Health Commission.   Here's the HL7 2.5.1 for labs.

Syndromic Surveillance is sent to the Department of Public Health and Boston Public Health Commission.   Here's the HL7 2.5.1 for surveillance.

Immunizations are sent to the Department of Public Health and Boston Public Health Commission.  Here's the HL7 2.5.1 for immunizations.
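For readers unfamiliar with HL7 v2 messaging, here's a minimal Python sketch that assembles a pipe-delimited 2.5.1 lab result message (ORU^R01).  The sender/receiver names, patient identifiers, and codes are hypothetical; real reportable-lab feeds follow the public health implementation guides.

from datetime import datetime

def build_lab_message(patient_id, test_code, test_name, value):
    """Assemble a skeletal HL7 2.5.1 ORU^R01 reportable-lab message.
    All identifiers and codes here are hypothetical examples."""
    timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
    segments = [
        "MSH|^~\\&|SENDING_LIS|BIDMC|RECEIVER|DPH|%s||ORU^R01|MSG0001|P|2.5.1" % timestamp,
        "PID|1||%s^^^BIDMC||TESTPATIENT^JANE" % patient_id,
        "OBR|1|||%s^%s^LN" % (test_code, test_name),
        "OBX|1|ST|%s^%s^LN||%s||||||F" % (test_code, test_name, value),
    ]
    return "\r".join(segments)   # HL7 v2 segments are carriage-return delimited

print(build_lab_message("123456", "600-7", "Blood culture", "Positive"))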

Safety is improved through monitoring of results, symptoms, and immunizations in support of public health.

However, syndromic surveillance is limited by the accuracy of the structured signs and symptoms data captured by EHRs.   Ensuring that clinician observations are captured in an accurate, structured and timely way, then transmitted to public health requires more advanced vocabulary tools than exist in many EHRs.

Summarizing my observations:
1.  Summary data is an abstract captured at a moment in time.   Data corrections/updates are not sent.   Thus, data about the patient becomes incomplete and stale over time.  However, for the purpose intended, ensuring a transition of care is safe, a point in time summary may be good enough.

2.  Enhanced vocabulary tools that translate clinician observations into structured data (such as Kaiser's recent contribution of its intellectual property) are useful to convey the meaning of information exchanged.

3.  Implementation guides that specify required data elements are important so that receivers can accurately display the information exchanged.

4.  Testing approaches already used as part of the certification process validate that data in the EHR is exported into interoperable formats accurately.  NIST tools ensure that interoperable formats are compliant with standards.  The challenge is getting the data into structured electronic form to begin with and deciding what to exchange for a given purpose.

5.  Although not specifically discussed above, patient identification can be a challenge given the lack of a national patient identifier or an agreed upon way to link the same patient's data among multiple institutions.   The combination of labs, medications, and summaries from multiple sources might indicate a safety issue. Having a consistent approach to link these records would be helpful.
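As a sketch of why linkage is hard, here's a naive deterministic matching key in Python - normalized name plus date of birth, hashed.  Real matching algorithms are probabilistic and use many more attributes; this example just shows how easily a naive key breaks.

import hashlib

def linkage_key(last_name, first_name, dob):
    """Naive deterministic patient-matching key (illustrative only)."""
    normalized = "|".join([last_name.strip().lower(), first_name.strip().lower(), dob])
    return hashlib.sha256(normalized.encode()).hexdigest()

# Formatting differences are handled by normalization...
print(linkage_key("Doe", "Jane", "1960-01-15") == linkage_key("DOE", " jane", "1960-01-15"))  # True
# ...but a nickname defeats the match entirely.
print(linkage_key("Doe", "Janie", "1960-01-15") == linkage_key("Doe", "Jane", "1960-01-15"))  # False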

A number of these issues are part of the PCAST Workgroup discussion - should data be sent in context rich documents or separated into individual "atomic" data elements?  How granular is an atom - is it a problem list, a single problem, or a single field within a single problem (i.e. problem onset date)?  How should patient matching be done?  How should searching be done?   Should data be structured and vocabulary controlled or unstructured?

The IOM, ONC, and PCAST efforts are raising all the right issues.  I believe the Standards and Certification criteria for Stage 1, exemplified by all the standards samples documented above, are moving the country on the right trajectory to enhance the safety of care while ensuring HIT-assisted care is safe.
