Friday 27 May 2011
Off to the Highlands
I'm in Scotland for a few days lecturing at the University of Dundee near Edinburgh.
My colleagues in Scotland and I have much to discuss about healthcare technology and policy. I'll summarize my lessons learned next week.
Tomorrow a small group of us will climb some of the highest peaks in the UK - the Aonach Eagach ridge and Ben Nevis. They're known for their uniquely bad weather (171 inches of rain per year), high winds, and challenging trails.
I've packed my total body Gore-Tex and my lightweight Treksta Gore-Tex boots. Here's a complete list of my gear for the Scottish Highlands:
Boots
Treksta Evolution Mid (Gore-Tex)
Socks
Injinji Crew
Outdoor Research Verglas Gaiters (Gore-Tex)
Pants
Arcteryx Rho LT
Arcteryx Alpha SL (Gore-Tex)
Shirts
Arcteryx Phase SL
Arcteryx Phase AR
Jackets
Arcteryx Celeris windshell
Arcteryx Alpha LT (Gore-Tex)
Arcteryx Solo Belay Jacket
Gloves
Outdoor Research PL400
Outdoor Research Endeavor Mitts (Gore-Tex)
Headwear
Outdoor Research Option Balaclava
Outdoor Research Winter trek hat
Outdoor Research Drifter cap (Gore-Tex)
Other gear
Petzl e+lite
Small First aid kit
Black Diamond Shot climbing pack
Platypus 2L water reservoir
Pro-Bars (they're vegan)
Prescription polarized sunglasses
I'm off to the Highlands and will post pictures of the summits.
Thursday 26 May 2011
Building Birdhouses
When my wife and I created our community garden we installed birdfeeders (sunflower and thistle) and birdhouses to support the birds that nest in our area. I built the birdhouses from a single length of 1x6 cedar from Home Depot using these simple plans. Just 5 cuts with a Japanese handsaw and a few finishing nails, and they were ready for mounting. Birds moved in within hours.
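For the curious, a one-board box works out to a short cut list. The dimensions below are illustrative assumptions for a bluebird-style box, not the exact plans I used:

```python
# Hypothetical cut list for a one-board bluebird box cut from a
# single 1x6 cedar board. Five cuts yield six pieces.
CUTS = [  # (piece, length in inches)
    ("back", 16.0),
    ("front", 9.0),
    ("side", 9.0),
    ("side", 9.0),
    ("floor", 4.0),
    ("roof", 8.0),
]

def board_length_needed(cuts, kerf=0.125):
    """Total board length required, allowing one saw-kerf per piece."""
    return sum(length for _, length in cuts) + kerf * len(cuts)
```

With these assumed dimensions the whole box fits comfortably in a six-foot board, with room to spare for mistakes.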
Here's a photo of a cedar bluebird birdhouse, ready to be installed.
Here's another great site with house blueprints for several bird species.
My next project is a tree-mounted house for:
Black-capped Chickadees
Carolina Chickadees
Mountain Chickadees
Chestnut-backed Chickadees
Boreal Chickadees
Siberian Chickadees
House Wrens
Carolina Wrens
Bewick's Wrens
Winter Wrens
Prothonotary Warblers
Tree Swallows
Violet-green Swallows
Tufted Titmouse
Plain Titmouse
White-breasted Nuthatch
Red-breasted Nuthatch
Brown-headed Nuthatch
Pygmy Nuthatch
Brown Creeper
I highly recommend building cedar birdhouses with hand tools. It's great therapy.
Wednesday 25 May 2011
Meaningful Use Payments Arrive
HITECH Incentive payments began on May 19, four weeks after attestation for Meaningful Use became available.
On May 19, BIDMC received the following electronic funds transfer from CMS/Medicare:
CORPORATE TRADE PAYMENT CREDIT
CMS (EHR INCENT) DES:HITECH PMT ID:
BIDMC was the first hospital in the country to attest to meaningful use and received payment from CMS on the first day of stimulus disbursements. Hospital payments start with a $2 million base payment. Per the CMS FAQ:
"Eligible hospitals and CAHs will receive an initial payment and a final payment. Eligible hospitals and Critical Access Hospitals that attest in April can receive their initial payment as early as May 2011. Final payment will be determined at the time of settling the hospital cost report."
Although we received an initial $2 million payment, we have not received information about the final payment calculation or timing.
Some have worried that attesting early will create a timeline for Stage 2 that is challenging to meet:
Standards Committee work for stage 2 will be done by September 2011
ONC proposed regulations will be drafted in the Fall, released in December, and will become final in mid-2012
The Stage 2 meaningful use demonstration period begins October 1, 2012
The likelihood that regulations can be transformed into working, implemented software by October 1, 2012 is slim.
Hence the HIT Policy Committee will likely recommend that Meaningful Use Stage 2 be deferred a year, meaning that the demonstration period for those who attest to stage 1 in 2011 will be moved to October 1, 2013.
Based on everything I know, here's the workplan I'd recommend for IT departments:
1. In 2011, update your purchased products as needed to implement meaningful use versions
2. In 2011, if your systems are built rather than bought (or are a combination of the two), use the CCHIT EACH program to certify your site as needed for hospital and ambulatory certification criteria.
3. In 2011, educate your clinicians and measure meaningful use metrics for a 90 day demonstration period. Note that this can be done in parallel with certification, since systems only need to be certified by the end of the demonstration period.
4. In 2011, collect your initial meaningful use payments
5. In 2011, work on X12 5010 for the January 1, 2012 deadline
6. In 2011, begin ICD10 planning for the October 1, 2013 deadline. I believe this deadline will be extended.
7. In 2012, plan on beginning your Meaningful Use stage 2 measurement period on October 1, 2013.
A lot is going on in parallel, but by taking it one day at a time, step by step, it's doable.
Tuesday 24 May 2011
A Strawman HIE Directory Solution
At the May HIT Standards Committee, we discussed the standards which support entity-level (organization) provider directories (ELPDs) in healthcare information exchanges.
The business requirements suggested by the HIT Policy Committee's work (the table below) require federated query/response transactions to a single, nationwide enterprise level provider directory, specifically:
1) Support directed exchanges (send/receive as well as query/retrieve)
2) Provide basic “discoverability” of entity – including demographic information
3) Provide basic “discoverability” of exchange services (e.g., CCD, HL7 2.xx)
4) Provide basic “discoverability” of entity’s security credentials
When we presented the currently available standards - DSML for the schema, LDAP/ISO for the query vocabulary, REST/SOAP for the transport, and LDAP for the query language - the response from the HIT Standards Committee was that the combination of these standards, as specified in the IHE HPD profile, was largely untested in production.
Our conclusion was to revisit the business requirements with the HIT Policy Committee with the hope that we could devise a workflow enabling existing, mature standards, such as DNS, to be used for provider directories.
The presentation by the Privacy and Security Workgroup included this summary of how the existing NwHIN exchanges – Direct and Exchange – provide the required services.
One possible avenue for moving forward might be to build upon the Direct Project’s work to enable the Domain Name Service (DNS) to be used as the federated service for discovering entities and their security credentials. I recently learned about an idea that Paul Egerman has suggested to the ONC: the possibility of creating a top-level domain for the health industry. Putting those two ideas together, here is a strawman that would move us forward.
1. The ELPD should be a single, national data structure that is accessible by EHR systems. Accessibility needs to include the capability to have a local cache.
2. A national ELPD could be achieved through the use of a top-level domain for the health industry (e.g., .HEALTH) - instead of .GOV, .EDU, .COM, .MIL, .ORG, or .NET - to designate entities participating in healthcare information exchange.
With a .HEALTH top-level domain there could be a defined set of registrars who are authorized to issue .HEALTH domain names. The benefits of doing this include:
Financial - The business model for registrars is already established, while there is no business model for other approaches being explored.
Leverages Existing Software Capabilities - The software for registering entities and making updates for domain names is well established. The use of DNS is well-known and can easily handle a national entity directory. DNS (along with "WhoIs") can be used by EHR systems.
Security - We could restrict query of the .HEALTH domain to other members of the .HEALTH domain, reducing its vulnerability to denial of service attacks and spamming.
3. The ELPD would embrace the Direct Project's implementation guide for storing digital certificates in, and retrieving digital certificates from, DNS.
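To make the Direct convention concrete, here is a minimal sketch of how an EHR might derive the DNS name to query for an address-bound certificate. My understanding of the Direct Project's certificate discovery guide is that the '@' in a Direct address is replaced with a '.' to form the DNS query name, while domain-bound certificates are queried at the bare domain; the addresses below are hypothetical.

```python
# Sketch: map a Direct address to the DNS name used for a CERT
# record lookup, per the Direct Project's certificate discovery
# convention (as I understand it). Addresses are hypothetical.

def cert_query_name(direct_address: str) -> str:
    """Return the DNS name to query for a CERT record."""
    local, _, domain = direct_address.partition("@")
    if not domain:
        # No '@': treat the input as a bare domain (domain-bound cert).
        return local.lower()
    # Address-bound certificate: replace '@' with '.'.
    return f"{local}.{domain}".lower()
```

For example, a certificate for drsmith@direct.example.HEALTH would be sought at the DNS name drsmith.direct.example.health.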
As for the HIT Policy Committee’s request for standards supporting the discovery of demographic information and exchange capabilities, that functionality could be achieved using a decentralized approach. For example, the Standards Committee could specify that each organization needs to have a Uniform Resource Identifier (URI) where they list additional information about their organization, including their health information exchange send and receive capabilities (e.g. http://www.bidmc.HEALTH/services). Such an approach would be easy to maintain and would be extensible.
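The decentralized services-URI idea could be as simple as each organization publishing a small machine-readable document. The format below is purely a hypothetical illustration; the Standards Committee has not specified one:

```python
import json

# Hypothetical /services document an organization might publish at
# its well-known URI. The schema here is an assumption for
# illustration only, not a specified standard.
SERVICES_DOC = """{
  "organization": "Example Medical Center",
  "exchange": {
    "send": ["Direct SMTP", "SOAP"],
    "receive": ["Direct SMTP"],
    "content": ["CCD", "HL7 2.5.1"]
  }
}"""

def can_receive(doc: str, transport: str) -> bool:
    """Check whether the organization advertises a receive capability."""
    caps = json.loads(doc)
    return transport in caps["exchange"]["receive"]
```

An EHR could fetch such a document once, cache it, and route transactions only over transports the receiving organization actually supports.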
Thus, rather than try to invent new standards, processes, and business models, let's leverage the basic standards of the internet - a top-level domain, DNS, and URIs - to create the Directory Services we need to enable Health Information Exchange.
As a next step, the Privacy and Security Workgroup will consider the possibilities of this strawman.
Based on the guiding principles for the HIT Standards Committee articulated in the first meetings of the committee - keep it simple, do not let perfection be the enemy of the good, design for the little guy, leverage the internet, and keep the burden/cost of implementation low, I'm convinced the notion of using a top-level domain, existing DNS standards and URIs to support health information exchange directories is worthy of serious consideration.
Monday 23 May 2011
Medical Device Data Systems
In February, the FDA issued an important rule on Medical Device Data Systems (MDDSs), categorizing them as subject to FDA Class I general controls.
What is an MDDS?
MDDSs are data systems that transfer, store, convert according to preset specifications, or display medical device data without controlling or altering the function or parameters of any connected medical device—that is, any other device with which the MDDS shares data or from which the MDDS receives data.
What are FDA Device categories?
The Federal Food, Drug, and Cosmetic Act (the FD&C Act) (21 U.S.C. 301 et seq.) establishes a comprehensive system for the regulation of medical devices intended for human use. Section 513 of the FD&C Act (21 U.S.C. 360c) establishes three categories (classes) of devices, depending on the regulatory controls needed to provide reasonable assurance of safety and effectiveness. The three categories of devices are class I (general controls), class II (special controls), and class III (premarket approval). General controls include requirements for registration, listing, adverse event reporting, and good manufacturing practice (quality system requirements) (21 U.S.C. 360c(a)(1)(A)). Special controls are controls that, in addition to general controls, are applicable to a class II device to help provide reasonable assurance of that device’s safety and effectiveness (21 U.S.C. 360c(a)(1)(B)).
A member of the legal community wrote me:
"John: I have been getting up to speed on the recent FDA rule governing Medical Device Data Systems. This rule would appear to regulate the development of interfaces between medical devices and hospital information systems. Have you or anyone on your team looked at this issue? "
I consulted one of the leading HIT vendors, which responded
"John: We have indeed studied the MDDS rule and after much deliberation, it does appear that vendor- or healthcare-organization-developed black boxes or interfaces which store or transport data from a medical device to another database for use in clinical decision making fall into the category of MDDS. (The EHR itself does NOT fall into this designation.)
We are preparing to register with the FDA a series of interfaces such as the following:
Lab Instrument results interface
Radiology/Cardiology PACS interfaces
Hemodynamic monitor interface
Dynamap interface
etc
The good news is that there are no 510(k) filings required, but you do need to show that you follow Quality Management System protocols, such as ISO. We recently got ISO 9001:2008 certified in anticipation of more and more FDA regulations coming our way."
The regulation does include a review of the scope of the MDDS definition and notes CPOE and e-Prescribing are not MDDSs. However, the regulation should be studied by vendors and hospitals who build systems to identify the applications and modules that require registration with the FDA, adverse event reporting and possible organizational ISO 9001 certification as evidence of quality management.
The regulation strikes an interesting balance - how to encourage innovation while also requiring accountability for errors that result from software or hardware defects.
Definitely worth a read to ensure you are compliant!
Friday 20 May 2011
An FAQ on Exchanging Key Clinical Information
Yesterday, CMS posted an important FAQ clarifying the Meaningful Use requirement to exchange key clinical information.
Since the Standards and Certification final rule does not include any transport standards and no EHR has been tested or certified to comply with any particular transport capability, it was unclear how the CCDs/CCRs produced by certified EHRs should be exchanged as required to attest for meaningful use.
The CMS FAQ suggests that the exchange must be accomplished over an electronic network and not using fixed media:
"No, the use of physical media such as a CD-ROM, a USB or hard drive, or other formats to exchange key clinical information would not utilize the certification capability of certified EHR technology to electronically transmit the information, and therefore would not meet the measure of this objective"
Here's my advice - do an exchange of a single CCD/CCR via a secure website, secure FTP or secure email and you'll be fine.
Although 1) certification focused on content and vocabulary standards, not transport standards, and 2) certified EHRs cannot tell the difference between a CCD/CCR received via a network and one exchanged on media, CMS has given us guidance. Also, it's a best practice to avoid the use of media for protected health information, because having clinical data on mobile media is a security risk, as noted in the OIG report.
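For the secure email option, the mechanics are straightforward: the CCD travels as an attachment over an encrypted channel. Here is a minimal sketch of packaging a CCD as a MIME attachment; a real Direct message would add S/MIME signing and encryption, which is omitted here, and the addresses and filename are hypothetical.

```python
from email.message import EmailMessage

# Sketch: package a CCD as an email attachment for transmission over
# a secure channel (e.g., Direct SMTP over TLS). S/MIME signing and
# encryption, required for real exchange, are omitted.

def package_ccd(ccd_xml: str, sender: str, recipient: str) -> EmailMessage:
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Clinical summary"
    msg.set_content("A CCD clinical summary is attached.")
    msg.add_attachment(ccd_xml.encode("utf-8"),
                       maintype="application", subtype="xml",
                       filename="summary.ccd.xml")
    return msg
```

The same payload could equally well be posted to a secure website or dropped on a secure FTP server; the point of the CMS FAQ is only that it must traverse a network, not physical media.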
Thursday 19 May 2011
How Do Vegans Get Enough Protein?
As a vegan, one of the more frequent questions I'm asked is "if you eat only plants, how do you get enough protein?"
A recent movie review of Forks Over Knives in the Boston Globe speculated that vegans must have a hard time with protein and essential nutrients.
Somehow the average consumer has forgotten that plants are filled with protein (have you ever heard the term "textured vegetable protein"?)
As a vegan for over 10 years, I've never had any issues with protein, necessary amino acids, or essential nutrients. I get everything I need from a simple balanced diet that includes protein rich plants such as spinach, soy, and peanuts.
Here's a useful resource about the protein content in vegetables.
How do vegans get enough protein? Just pass the spinach!
Here are a few favorite protein-rich recipes.
About the only issue vegans have is getting enough B12. I need about a thimbleful per year! I can just get it from B12 found naturally in the topsoil that sticks to root vegetables (no matter how much you clean them) or take an occasional supplement extracted from yeast.
No meat, milk, or cheese required!
Wednesday 18 May 2011
The May HIT Standards Committee meeting
The May HIT Standards Committee meeting focused on the schedule of work ahead to provide ONC with the standards needed for Meaningful Use Stage 2 regulation writing.
We began the meeting by reviewing the meeting topics for April to September, shown in the table below.
To ensure all the necessary standards are included in our project plan, we asked Paul Tang to present the latest proposed MU stage 2 criteria. His presentation included several new recommendations:
*CPOE requirements should be expanded to 60% of medication and lab orders as well as demonstration of radiology orders
*Clinicians should be able to refine drug/drug interaction alerts so that alerts are accurate and actionable. In Stage 3, EHRs should be able to access national lists of drug/drug interaction rules
*20% of hospital discharge medication orders should be e-prescribed
*Demographic capture should include expanded race/ethnicity as noted in the IOM Report Race, Ethnicity, and Language Data: Standardization for Health Care Quality Improvement.
*Stage 3 should include second hand smoke exposure as a tobacco use type
*40% of hospital labs sent to outpatient providers should be electronic and include LOINC vocabularies
*Eligible professionals should document electronic notes on 30% of visits. Hospitals should have electronic notes for 30% of patient days
*The Electronic Medication Administration Record should be in use
*Standards-based Family History documentation should be included in stage 3
*For Hospitals, greater than 25 patients should receive electronic discharge instructions and 10% of patients should view and download information about a hospital admission
*For Eligible Professionals, 10% of patients should view and download their health information
*10% of all patients should receive educational materials
*Eligible Professionals should use secure messaging with greater than 25 patients
*Eligible Professionals should record communication preferences (secure email, PHR, snail mail etc) for 20% of patients
*Stage 3 should include a mechanism for capturing patient entered data in the EHR
*Medication reconciliation should be done for 50% of care transitions
*Hospitals should send summary of care records to professionals or long term care facilities for 10% of all discharges
*Eligible professionals should send at least 25 care summaries to other providers electronically
*10% of patients should have a list of care team members (unstructured text for now, structured data for stage 3)
*A care plan should be included with summary of care transmissions
*Hospitals and eligible professionals should submit at least one live immunization transaction
*Hospitals should submit at least one live reportable lab transaction
*Hospitals should submit at least one live syndromic surveillance transaction
*Cancer conditions should be reported to registries by eligible professionals
*Data on mobile devices should be encrypted
These new Stage 2 requirements require new standards for
*Extended race/ethnicity codesets
*Discharge medication e-prescribing
*Extended smoking codeset
*Possible new data exchanges supporting electronic medication administration record workflow (need to clarify the scope of the EMAR requirement)
*Representing care plans
*Reporting cancer conditions
*Encrypting mobile devices
These will be assigned to workgroups and power teams.
Dixie Baker presented the Privacy and Security Workgroup report focusing on provider directories. The scope of this work includes entity (i.e. organization) level directory queries from EHRs. After much discussion, we concluded that the Direct project protocols for DNS query in support of certificate exchange are good enough for the short term, while web-based query/response connections to enterprise LDAP directories are a reasonable future direction. We'll work with the Policy Committee and ONC to refine the business requirements, then produce a series of standards requirements as input to the S&I framework. The consensus of the committee is that a community-based directory is helpful but not necessary for exchange, just as there is no national directory of email addresses, yet we successfully exchange billions of emails per year.
Jim Walker presented the Clinical Quality Workgroup report outlining the work ahead to provide CMS with the standards needed to support quality measures by August.
Judy Murphy and Liz Johnson presented the Implementation Workgroup report, including their certification experience survey.
Jamie Ferguson presented the Clinical Operations Vocabulary Task Force report. The Task Force is specifying the vocabulary and codesets needed to accelerate semantic interoperability. The Implementation Workgroup and the Vocabulary Task Force will also make statements about certification questions such as support for Postel's Law - if a new vocabulary term is introduced, existing systems should continue to function.
Doug Fridsma led a discussion of the Summer Power Team activities
Stan Huff presented the Metadata Analysis Power Team report. There was general consensus that simple XML forms which support patient identity and provenance (who sent the message, when it was sent, etc.) using CDA R2 and X.509 signatures were good enough.
Marc Overhage submitted the Patient Matching Power Team report.
We will get updates on the Surveillance Implementation Guide (Chris Chute), e-prescribing of discharge needs (Jamie Ferguson) and NwHIN (Dixie Baker) at the next meeting.
A very productive meeting. The HIT Standards Committee is truly an effective team, representing varied interests but always able to chart a path forward that balances all points of view.
Tuesday 17 May 2011
The Status of e-Prescribing in the US
On May 12, Surescripts released the National Progress Report on e-Prescribing and Interoperable Healthcare.
For the past 3 years, Massachusetts has led the country in e-prescribing due to the combined efforts of our payers and our healthcare information exchange. I follow the evolution of e-prescribing with great interest.
Key findings in the Surescripts report include:
Electronic Prescribing Use
* Prescription Benefit: Electronic responses to requests for prescription benefit information grew 125% from 188 million in 2009 to 423 million in 2010.
* Medication History: Prescription histories delivered to prescribers grew 184% from 81 million in 2009 to 230 million in 2010.
* Prescription Routing: Prescriptions routed electronically grew 72% from 191 million in 2009 to 326 million in 2010.
* EMR vs. Standalone E-Prescribing Software: About 79 percent of prescribers used EMRs in 2010, up from 70 percent in 2009.
Electronic Prescribing Adoption
* Prescribers: The number of prescribers routing prescriptions electronically grew from 156,000 at the end of 2009 to 234,000 by the end of 2010—representing about 34 percent of all office-based prescribers.
* Payers: At the end of 2010, Surescripts could provide access to prescription benefit and history information for more than 66 percent of patients in the U.S.
* Community and Mail Order Pharmacies: At the end of 2010, approximately 91 percent of community pharmacies in the U.S. were connected for prescription routing and six of the largest mail order pharmacies were able to receive prescriptions electronically.
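The growth percentages quoted above follow directly from the transaction counts; a quick sanity check in Python (figures in millions):

```python
# Year-over-year growth rates implied by the Surescripts transaction counts.
def growth_pct(before: float, after: float) -> int:
    return round((after - before) / before * 100)

print(growth_pct(188, 423))  # prescription benefit responses: 125%
print(growth_pct(81, 230))   # medication histories: 184%
print(growth_pct(191, 326))  # routed prescriptions: ~71% (the report rounds to 72%)
```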
Findings that surprised me include:
* Family practitioners and small practices have high rates of e-prescribing compared to other specialties and practice sizes.
* Specialists, including cardiologists and ophthalmologists, are using e-prescribing more often than I expected.
E-prescribing is a unique interoperability success story. The standards are clear (NCPDP) and are required by regulation (Medicare Part D and the Standards and Certification Final Rule). Incentives are aligned (saves clinicians time and saves pharmacies money while making the entire process safer and more convenient for the patient). Let's hope our other interoperability efforts such as clinical summary exchange follow this same adoption trajectory over time as we provide unambiguous standards and change the culture to make interoperability an expectation of patients and providers.
Monday 16 May 2011
Should We Abandon the Cloud?
It's been a bad month for the cloud.
First there was the major Amazon EC2 (Elastic Compute Cloud) outage April 21-22 that brought down many businesses and websites. Some of the data was unrecoverable and transactions were lost.
Next, the May 10-13 outage of Microsoft's cloud-based email and Office services (Business Productivity Online Suite) caused major angst among customers who thought that the cloud offered increased reliability.
Then we had the May 11-13 Google Blogger outage which brought down editing, commenting, and content for thousands of blogs.
Outages from the 3 largest providers of cloud services within a 2 week period do not bode well.
Yesterday, Twitter went down as well.
Many have suggested we abandon a cloud only strategy.
Should we abandon the cloud for healthcare? Absolutely not.
Should we reset our expectations that highly reliable, secure computing can be provided at very low cost by "top men" in the cloud? Absolutely yes.
I am a cloud provider. At my Harvard Medical School Data Center, I provide 4,000 cores and 2 petabytes of data to thousands of faculty and staff. At BIDMC, I provide 500 virtualized servers and a petabyte of data to 12,000 users. Our BIDPO/BIDMC Community EHR Private Cloud provides electronic health records to 300 providers.
I know what it takes to provide 99.999% uptime. Multiple redundant data centers, clustered servers, arrays of tiered storage, and extraordinary power engineering.
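For perspective, "five nines" is an extraordinarily tight budget - a quick back-of-the-envelope calculation:

```python
# Downtime allowed per year at a given uptime percentage.
def allowed_downtime_minutes(uptime_pct: float) -> float:
    minutes_per_year = 365 * 24 * 60
    return (1 - uptime_pct / 100) * minutes_per_year

print(round(allowed_downtime_minutes(99.999), 2))  # five nines: ~5.26 minutes/year
print(round(allowed_downtime_minutes(99.9), 1))    # three nines: ~525.6 minutes/year
```

A single bad afternoon consumes years of a five-nines budget, which is why the engineering below is so demanding.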
With all of this amazing infrastructure comes complexity. With complexity comes unanticipated consequences, change control challenges, and human causes of failure.
Let's look at the downtime I've had this year.
1. BIDMC has a highly redundant, geographically dispersed Domain Name System (DNS) architecture. In theory it should not be able to fail. In practice it did. The vendor was attempting to add features that would make us even more resilient. Instead of making changes to a test DNS appliance, they accidentally made changes to a production DNS appliance. We experienced downtime in several of our applications.
2. HMS has clustered thousands of computing cores together to create a highly robust community resource connected to a petabyte of distributed storage nodes. In theory it should be invincible. In practice it went down. A user with limited high performance computing experience launched a poorly written job across 400 cores in parallel that dumped core every second while contending for the same disk space. Storage was overwhelmed and went offline for numerous applications.
3. BIDMC has a highly available cluster to support clinical applications. We've upgraded to the most advanced and feature-rich Linux operating system. Unfortunately, it had a bug that, in a very high performance clustered environment, made the entire storage filesystem unavailable. We had downtime.
4. BIDMC has one of the most sophisticated power management systems in the industry - every component is redundant. As we added features to make us even more redundant, we needed to temporarily reroute power, which is not an issue for us because every network router and switch has two power supplies. We had completed 4 of 5 data center switch migrations when the redundant power supply failed on the 5th switch, bringing down several applications.
5. The BIDPO EHR hosting center has a highly redundant and secure network. Unfortunately, bugs in the network operating system on some of the key components stopped all traffic from flowing.
These examples illustrate that even the most well engineered infrastructure can fail due to human mistakes, operating system bugs, and unanticipated consequences of change.
The cloud is truly no different. Believing that Microsoft, Google, Amazon or anyone else can engineer perfection at low cost is fantasy. Technology is changing so fast and increasing demand requires so much change that every day is like replacing the wings on a 747 while it's flying. On occasion bad things will happen. We need to have robust downtime procedures and business continuity planning to respond to failures when they occur.
The idea of creating big data in the cloud, clusters of processors, and leveraging the internet to support software as a service applications is sound.
There will be problems. New approaches to troubleshooting issues in the cloud will make "diagnosis and treatment" of slowness and downtime faster.
Problems on a centralized cloud architecture that is homogenous, well documented, and highly staffed can be more rapidly resolved than problems in distributed, poorly staffed, one-off installations.
Thus, I'm a believer in the public cloud and private clouds. I will continue to use them for EHRs and other healthcare applications. However, I have no belief that the public cloud will have substantially less downtime or lower cost than I can engineer myself.
The reason to use the public cloud is so that my limited staff can spend their time innovating - creating infrastructure and applications that the public cloud has not yet envisioned or refuses to support because of regulatory requirements (such as HIPAA).
Despite the black cloud of the past two weeks, the future of the cloud, tempered by a dose of reality to reset expectations, is bright.
Friday 13 May 2011
Cool Technology of the Week
How many times have you heard the complaint "the application is slow" but lack data about server, network, or desktop performance to facilitate diagnosis and resolution?
In a cloud environment, debugging application issues becomes ever more challenging.
As we all rollout EHRs to small provider offices, often with challenging internet connections, remote monitoring of cloud network performance becomes even more critical.
AppNeta's PathView microAppliance provides an easy-to-deploy, zero-administration network monitoring tool. It's about the same size as a cell phone and can be placed at remote business locations, requiring only power and an ethernet connection.
You place one of their devices in the location where you want to monitor network activity. When you plug it into the wall and an ethernet connection, it uploads network performance data to AppNeta cloud servers. You can also place two or more devices in separate locations and monitor the traffic between those locations. For EHR cloud providers, it's simple to configure the device, send it via UPS to a provider practice with instructions to plug it in, and gather performance data without complicated onsite network sniffing setups and configuration.
Currently we have deployed these devices at our central EHR private cloud site and two of our major remote practices. The level of detail and depth of available metrics and reports is amazing.
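PathView's internals are proprietary, but the core idea - a small probe at a remote site measuring path performance back to a central service - can be sketched in a few lines of Python (host and port below are placeholders, and this is an illustration of the concept, not AppNeta's actual measurement technique):

```python
# Toy latency probe: time a TCP handshake to a central server and
# report the round-trip in milliseconds.
import socket
import time

def tcp_rtt_ms(host: str, port: int, timeout: float = 3.0) -> float:
    """Return the TCP connect time to host:port in milliseconds."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the handshake completing is the measurement
    return (time.monotonic() - start) * 1000
```

A real probe would repeat measurements like this periodically and upload the samples to a central collector for trending and alerting.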
A low cost, zero administration, cloud-based, network sniffer that is truly plug and play. That's cool!
Thursday 12 May 2011
The Community Garden - Before and After
Last month, I posted my 2011 gardening plan including a design for a new community garden plot in Wellesley.
Here's what we started with:
After a few weekends of hauling debris, pounding fence posts, pouring concrete, and a bit of woodworking with Japanese handsaws, it's finished.
All of the improvements are now the property of the town of Wellesley, since the community garden is shared public space.
Here's what we did:
On April 30, we rented a U-Haul and moved 2000 pounds of rotting wood, metal scraps, plant debris, plastic, and paper from the community garden space to the Wellesley recycling center. We purchased 4x4 fence posts, T-posts, wire fencing, hardware cloth, and lumber for an arbor at Home Depot. We rototilled the space, dug post holes, and poured concrete. On May 1, we completed the wire fencing and built the arbor.
On May 7-8, we hauled soil and bark, created 5 raised beds, and planted our seeds/seedlings. We built a small bird house and added a thistle feeder. We added irrigation.
$500 and two weekends later, here's the result.
Trellis: Canadice and Himrod grapes
Long Fence: Mammoth Sunflowers
Rear Fence: Sweet peas (flowering) and morning glory
Bed #1 (near gate)
Early lettuce as green mulch
Cucumber (1 Midori)
Summer Squash (1 Kousa and 1 yellow)
Brussels Sprouts (6 center row)
Cabbage (3 purple and 3 savoy)
Romanesco Veronica Cauliflower
Broccoli (6)
Violet Cauliflower
Basil
Dill
Borage
Celery (1)
Nasturtium (1 corner)
Marigold Little Gem (3 corners)
Bed #2
Cutting flowers (calendula, zinnia, cosmos, bachelor buttons, nigella, salvia, stock, aster, snapdragon, gomphrena)
Nasturtium (1 corner)
Marigold Little Gem (3 corners)
Bed #3
Asparagus (20 Jersey Supreme one year crowns, and 50 Purple Passion)
Bed #4
Early lettuce as green mulch
Winter Squash (1 red kabocha, 1 delica kabocha)
“Gita” Long Bean (tower)
Rosemary
Costata Romanesco Zucchini (1)
3 Broccoli
Carrots
Daikon
Borage
Celery (1)
Nasturtium (corner)
Marigold Little Gem (3 corners)
Bed #5 (near hose) - (use for Lettuce in early spring)
Early lettuce as green mulch
Tomatoes (1 Principe Borghese, 1 Costoluto Genovese, 1 German Striped)
Eggplant (1 Kermit, 1 Turkish Red, 4 Asian Wonder)
Husk groundcherry (6)
Peppers (1 Boris Banana, 1 Sweet Italian)
6 Kale
Onions
Borage
Sage
Basil
Nasturtium (1 corner)
Marigold Little Gem (3 corners)
I look forward to a bountiful harvest this season.
Wednesday 11 May 2011
On Becoming a Harvard Professor
Almost 15 years ago on June 15, 1996, I moved from California to Massachusetts. I began practicing Emergency Medicine at Beth Israel Deaconess Medical Center. On that day, I wrote in my journal:
"Today I've started work at one of the best hospitals in the country. I'm surrounded by smart people, amazing technology, and incredible possibilities. What am I, who am I, what will I be? I'm an instructor and the path to Harvard Professor seems insurmountable."
Today, I joined several friends and colleagues to celebrate my becoming a Harvard Professor.
Along the journey, I've learned many lessons. Professorship is not about fame, fortune, or what I know. It's about community. Early in my Harvard career, Dr. Tom Delbanco, Sam Fleming, Warren McFarlan, Marvin Schorr, and others advised me to focus on creating teams of smart people to change the world. From my discussions with Deans and faculty, here are the top 5 roles of a Harvard Professor:
*Training the next generation - I have 20 years left in my career. Now is the right time to develop the next generation of informatics and IT leaders by sharing my experience and giving them an opportunity to thrive. I'll do my best to inspire and mentor students, residents, fellows and junior faculty by always being available to them.
*Communicating ideas - publishing, lecturing, meeting, blogging, and serving on expert panels ensures that ideas and innovation are widely disseminated. Today's blog is my 900th post, creating a permanent record of the key ideas I encounter in my life as a healthcare CIO.
*Serving as role model - a strong sense of ethics and equanimity, always being moral and fair in every conversation and relationship, fosters an environment that encourages people to excel.
*Building teams - assembling and resourcing the best people, especially those with differing opinions and experiences, leads to innovation.
*Creating an ideal work and learning environment - accepting accountability for resolving personnel conflicts, budget shortfalls, strategic ambiguity, political barriers, and impediments to the free exchange of ideas empowers teams to succeed.
So now the next phase of my career begins. I feel humbled by the responsibility and will do my best to train, communicate, serve, build, and create!
Tuesday 10 May 2011
The Governor's Healthcare IT Conference
Yesterday, the Governor's Healthcare IT Conference included remarks from Massachusetts HHS Secretary Bigby, Former National Coordinator David Blumenthal, Governor Deval Patrick, Special Assistant to the Administrator of CMS Sachin Jain, and a panel of industry experts.
Here are the key points.
Secretary Bigby introduced the meeting by noting the importance of healthcare IT for increasing safety, quality, efficiency, patient engagement, and equity in healthcare across the Commonwealth.
David Blumenthal summarized the accomplishments of ONC over the past two years and highlighted the work left to be done. He noted that the HITECH act and its meaningful use constructs are a "downpayment" on healthcare reform, creating the necessary infrastructure over years to enable changes in healthcare delivery and reimbursement. The trajectory that we're on for meaningful use includes three stages: stage 1, which aligns incentives for providers to adopt and use EHRs; stage 2, which provides the standards and tools to exchange data; and stage 3, which provides decision support tools and analytics. In each stage, privacy protection is a high priority. Breach notification requirements have been enhanced and penalties for breaches have been levied.
Thus far, 700 healthcare IT products have been certified, many by companies with fewer than 50 employees. 36,000 providers have registered to participate in incentive programs. $64 million has already been paid to 500 organizations as part of the Medicaid incentive program. On May 18 the Medicare incentive payments begin. 56 state designated health information exchanges have been created and 56 state HIE coordinators have been named. 62 Regional Extension Centers have been created which have enrolled 67,000 providers. About 25% of all primary care clinicians in the country now participate in regional extension center programs.
There has been a market change - Meaningful Use is becoming an emblem of quality. 80% of all hospitals intend to participate in stage 1 of Meaningful Use. The challenges ahead are many - we need additional standards, enhanced technology, and additional policy. However, the major change we need is cultural. Communities need to demand and encourage data sharing for care coordination, public health, and other uses.
Deval Patrick's remarks demonstrated significant domain expertise about healthcare IT and health information exchange. He highlighted Massachusetts' pivotal role as a leader in HIT product development, job creation, health information exchange, policymaking, and training. He encouraged all of us to break down data silos and create data liquidity - accelerating data exchange among payers, providers, and patients regardless of organizational boundaries.
Sachin Jain highlighted the importance of the CMS Center for Innovation noting that it empowers the CMS administrator to expand local demonstration projects to national scale if there is evidence they improve quality/reduce cost. The $1 billion dollar Partnership for Patients program is a part of the CMS Center for Innovation.
We closed the day with a panel session of healthcare IT stakeholders:
Alice Coombs, MD, President, Massachusetts Medical Society
Karen Bell, MD, MMS, Chair, Certification Commission for Health Information Technology
Lynn Nicholas, President, Massachusetts Hospital Association
Charlotte Yeh, MD, Chief Medical Officer, AARP Services
Alice highlighted the need for usability of EHRs such that clinician workflow is aided, not impeded by technology.
Karen discussed the need for clinicians to look beyond basic federal certification and think about clinical decision support features, data portability, security protections and vendor commitments to usability.
Lynn noted that CPOE and other technologies can introduce errors and adverse events. We need to ensure the technology is implemented wisely and clinicians are appropriately trained.
Charlotte represented the needs of consumers and suggested we embrace technology that brings demonstrated value to patients. As we think about PHRs, home care devices, and patient engagement, we must evolve from actions done "to the patient" to "for the patient" to "with the patient".
The bottom line - we should ensure our EHRs have the functionality we need to support safe, quality, efficient care with health information exchange, decision support, and security protections. We want these applications to be highly useable and integrated into workflows. We want them to incorporate policies that enhance the patient and provider experience.
I also made remarks about the additional standards work that will be done to enable all these goals. I'll expand on our "Summer Camp of Standards" work next week.
Here are the key points.
Secretary Bigby introduced the meeting by noting the importance of healthcare IT for increasing safety, quality, efficiency, patient engagement, and equity in healthcare across the Commonwealth.
David Blumenthal summarized the accomplishments of ONC over the past two years and highlighted the work left to be done. He noted that the HITECH Act and its meaningful use constructs are a "downpayment" on healthcare reform, creating the necessary infrastructure over years to enable changes in healthcare delivery and reimbursement. The trajectory that we're on for meaningful use includes three stages: stage 1, which aligns incentives for providers to adopt and use EHRs; stage 2, which provides the standards and tools to exchange data; and stage 3, which provides decision support tools and analytics. In each stage, privacy protection is a high priority. Breach notification requirements have been enhanced and penalties for breaches have been levied.
Thus far, 700 healthcare IT products have been certified, many by companies with less than 50 employees. 36,000 providers have registered to participate in incentive programs. $64 million has already been paid to 500 organizations as part of the Medicaid incentive program. On May 18 the Medicare incentive payments begin. 56 state designated health information exchanges have been created and 56 state HIE coordinators have been named. 62 Regional Extension Centers have been created which have enrolled 67,000 providers. About 25% of all primary care clinicians in the country now participate in regional extension center programs.
There has been a market change - Meaningful Use is becoming an emblem of quality. 80% of all hospitals intend to participate in stage 1 of Meaningful Use. The challenges ahead are many - we need additional standards, enhanced technology, and additional policy. However, the major change we need is cultural. Communities need to demand and encourage data sharing for care coordination, public health, and other uses.
Deval Patrick's remarks demonstrated significant domain expertise about healthcare IT and health information exchange. He highlighted Massachusetts' pivotal role as a leader in HIT product development, job creation, health information exchange, policymaking, and training. He encouraged all of us to break down data silos and create data liquidity - accelerating data exchange among payers, providers, and patients regardless of organizational boundaries.
Sachin Jain highlighted the importance of the CMS Center for Innovation noting that it empowers the CMS administrator to expand local demonstration projects to national scale if there is evidence they improve quality/reduce cost. The $1 billion dollar Partnership for Patients program is a part of the CMS Center for Innovation.
We closed the day with a panel session of healthcare IT stakeholders:
Alice Coombs, MD, President, Massachusetts Medical Society
Karen Bell, MD, MMS, Chair, Certification Commission for Health Information Technology
Lynn Nicholas, President, Massachusetts Hospital Association
Charlotte Yeh, MD, Chief Medical Officer, AARP Services
Alice highlighted the need for usability of EHRs such that clinician workflow is aided, not impeded by technology.
Karen discussed the need for clinicians to look beyond basic federal certification and think about clinical decision support features, data portability, security protections and vendor commitments to usability.
Lynn noted that CPOE and other technologies can introduce errors and adverse events. We need to ensure the technology is implemented wisely and clinicians are appropriately trained.
Charlotte represented the needs of consumers and suggested we embrace technology that brings demonstrated value to patients. As we think about PHRs, home care devices, and patient engagement, we must evolve from actions done "to the patient" to "for the patient" to "with the patient".
The bottom line - we should ensure our EHRs have the functionality we need to support safe, quality, efficient care with health information exchange, decision support, and security protections. We want these applications to be highly useable and integrated into workflows. We want them to incorporate policies that enhance the patient and provider experience.
I also made remarks about the additional standards work needed to enable all these goals. I'll expand on our "Summer Camp of Standards" work next week.
Monday 9 May 2011
Speed Dating for IT
As a CIO, I gather information about new products and innovation in many ways. I search the web for emerging technologies, read numerous publications/newsletters, and constantly meet with vendors and IT professionals who are creating novel applications.
However, these are not the most efficient ways to rapidly assess whether products are operational or exist only in PowerPoint.
BluePrint Healthcare IT - a company founded in 2003 to provide security, privacy, compliance and risk management services to hospitals and healthcare systems - has created a new approach to solve the problem of connecting early stage innovators and customers. They call it Speed Dating for IT.
Their BluePrint Health IT Innovation Summit Series is aligned with current innovation programs and initiatives sponsored by HHS and ONC promoting new technologies.
The idea is simple:
10 healthcare technology companies and 10 healthcare providers interact virtually within the framework of a "Health IT Innovation Matching System". Then, in one place at one time, those that match can meet for a dialog about piloting these new applications, realizing that there is risk but also market differentiation for those early adopters that achieve breakthrough results.
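The matching idea can be illustrated with a small sketch. This is purely hypothetical - BluePrint has not published how its "Health IT Innovation Matching System" actually works, and the company and hospital names below are invented - but tag-overlap scoring is one plausible way to pre-screen pairs before anyone books a meeting:

```python
# Hypothetical sketch of interest-tag matching between vendors and
# providers. The real matching system's logic is not public; all names
# and tags here are invented for illustration.

def match_score(vendor_tags, provider_needs):
    """Score a vendor/provider pair by the number of overlapping tags."""
    return len(set(vendor_tags) & set(provider_needs))

def best_matches(vendors, providers, threshold=1):
    """Return (vendor, provider, score) tuples meeting the threshold,
    highest score first."""
    pairs = []
    for v_name, v_tags in vendors.items():
        for p_name, p_needs in providers.items():
            score = match_score(v_tags, p_needs)
            if score >= threshold:
                pairs.append((v_name, p_name, score))
    return sorted(pairs, key=lambda t: -t[2])

vendors = {"Acme Analytics": {"decision support", "quality reporting"},
           "CloudChart": {"hosting", "security"}}
providers = {"General Hospital": {"decision support", "security"}}
print(best_matches(vendors, providers))
```

Only the pairs that clear the threshold ever meet face to face, which is the whole point of the speed-dating format.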
I like it - a kind of eHarmony for IT. I can pre-screen my vendors and we can determine if there is a fit before we spend time in meetings.
BIDMC will participate in the May 26 event. I'll let you know how Speed Dating for IT goes. My wife has given her approval.
Friday 6 May 2011
Cool Technology of the Week
As a glasses wearer for over 40 years, I've been an active user of many lens "technologies". Now that I'm nearly 50, I wear progressive lenses, which ease my eye strain during screen time and close-up work.
However, there is an issue - when I look down, I lose my distance vision. My prescription is -7 diopters, so I cannot easily switch between two pairs of glasses, one for distance and one for closeup. An ideal bifocal would enable me to change my glasses prescription in real time.
That's now possible with the PixelOptics electronic lens built with liquid crystal technology.
The lenses are made by Panasonic and change prescription on command, either via head movement or by activating a switch.
They hold a charge for 3 days.
Currently, they cost about $1000, but I expect that to come down as demand causes manufacturing scale to expand.
Glasses that change prescription on the fly. That's cool!
Thursday 5 May 2011
I Could Have Had a V8
For the past 10 years, I've kayaked the Charles River several times a week between April and October. Rather than owning a kayak, I've purchased a season pass from Charles River Canoe and Kayak.
This year, I found a kayak with the ideal combination of speed, size, weight, stability, and workmanship - the Epic V8 Surfski (pictured above).
In the past, I've considered products from KayakPro, Think, and Epic kayaks such as the V10.
Each was lacking something. The Epic V8 has it all.
*It's fast, enabling me to maintain a 6 mph pace
*It fits in my 19 foot garage, while most other surfskis are longer than 20 feet
*It weighs 30 pounds, so I can easily take it on and off the car myself
*It's stable in rough, windy conditions, even when speeding bass fishermen create 3 foot wakes
*It's a high quality boat with excellent engineering and kevlar/carbon materials at a reasonable price
It's taken a decade of waiting for this perfect design, but I've finally purchased my own kayak. Now I'll never need to make the statement, "I could have had a V8".
Wednesday 4 May 2011
Breach Fatigue
You've read about the Sony privacy breach, the Epsilon email compromise, and recent high profile privacy breach settlements.
Every day the headlines are filled with so many such security issues that it almost seems like background noise. Just as too much decision support can result in alert fatigue and too many false alarms can result in alarm fatigue, the barrage of security breach news can lead to breach fatigue, causing you to let down your guard. Forewarned is forearmed, so push aside your breach fatigue and plan for the day when you will have to run your own breach notification. Here's a task list to guide you:
Immediate response actions
Report to Police Department
Notify Legal Counsel
Notify Privacy Officer
Notify CEO
Notify Clinical and IT Leadership
Notify Board of Directors
Notify Liability Insurer
Develop action plan
Analysis
Inventory unsecured data
Draft Risk Assessment rules (what data in combination is reportable, e.g. name + Social Security number)
Finalize Risk Assessment rules
Conduct Risk Assessment
Complete Risk Assessment Report
Complete Reporting Requirements Report
Regulatory Reporting and Notifications
Define practice strategy/approach
Initial communication with practices
Notifications
Draft notification to Media
Oral notification to federal/state authorities including approval of notices
Office for Civil Rights
Attorney General
Office of Consumer Affairs
Practice approval of media notification
Distribute notification to media
Complete Practice specific spreadsheets
Choose credit monitoring service
Complete credit monitoring service contract
Prepare Patient Notices
Practice related activities
Initial call
Follow-up visit scheduled
Practice packages complete
Practice packages delivered to practice
Re-identification visits scheduled (to notify patients, you'll need addresses which may not be included in the actual data breached)
Re-identification complete
Patient notifications complete
Patient notifications sent
Attorney General reports filed
Office of Consumer Affairs reports filed
Office for Civil Rights reports filed
Communications
Prepare talking points for various channels
Staff a communication office (approximately 10% of notified patients will call)
Remediation
Cross-Organizational Review of processes and procedures which led to the breach
Remediation of root causes
Security policy updates as needed
Laptop encryption as needed
Additional training as needed
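The risk assessment steps in the checklist hinge on rules about which combinations of exposed data elements make a record reportable. A minimal sketch of how such rules might be applied in triage - the example rule set below is illustrative only, not legal guidance; your counsel and privacy officer define the real rules:

```python
# Illustrative risk-assessment rule check: a record is treated as
# reportable if its exposed fields contain all elements of any rule.
# These example combinations are NOT legal guidance; actual rules come
# from counsel and the applicable state/federal regulations.

REPORTABLE_COMBINATIONS = [
    {"name", "ssn"},
    {"name", "medical_record_number"},
    {"name", "diagnosis"},
]

def is_reportable(exposed_fields):
    """True if the exposed fields include any reportable combination."""
    exposed = set(exposed_fields)
    return any(rule <= exposed for rule in REPORTABLE_COMBINATIONS)

def triage(records):
    """Split breached records into reportable and non-reportable lists."""
    reportable = [r for r in records if is_reportable(r["exposed"])]
    benign = [r for r in records if not is_reportable(r["exposed"])]
    return reportable, benign

records = [
    {"id": 1, "exposed": ["name", "ssn"]},
    {"id": 2, "exposed": ["name", "zip_code"]},
]
reportable, benign = triage(records)
print(len(reportable), len(benign))  # record 1 is reportable, record 2 is not
```

Automating this pass over the inventory of unsecured data makes the Risk Assessment Report and the downstream notification counts reproducible.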
Follow the advice of your privacy officer and your legal counsel completely. Be transparent. Over communicate. Use the event as a teachable moment for your organization and your community. Be humble and apologize. Protect the patients and the providers.
As we continue the journey toward automation of electronic records to enhance safety and quality, we must retain the trust of our patients. Following the plan above will go far to address those events that occur as we all learn how to be better protectors of the data we host.
Tuesday 3 May 2011
Meaningful Use Payments
Now that eligible professionals and hospitals are attesting to Meaningful Use, they are asking how and when incentives payments will be made. Here's the answer from CMS:
For eligible professionals (EPs), incentive payments for the Medicare EHR Incentive Program will be made approximately four to eight weeks after an EP successfully attests that they have demonstrated meaningful use of certified EHR technology. However, EPs will not receive incentive payments within that timeframe if they have not yet met the threshold for allowed charges for covered professional services furnished by the EP during the year. Payments will be held until the EP meets the $24,000 threshold in allowed charges for calendar year 2011 in order to maximize the amount of the EHR incentive payment they receive. If the EP has not met the $24,000 threshold in allowed charges by the end of calendar year 2011, CMS expects to issue an incentive payment for the EP in March 2012 (allowing 60 days after the end of the 2011 calendar year for all pending claims to be processed).
Payments to Medicare EPs will be made to the taxpayer identification number (TIN) selected at the time of registration, through the same channels their claims payments are made. The form of payment (electronic funds transfer or check) will be the same as claims payments.
Bonus payments for EPs who practice predominantly in a geographic Health Professional Shortage Area (HPSA) will be made as separate lump-sum payments no later than 120 days after the end of the calendar year for which the EP was eligible for the bonus payment.
Please note that the 90-day reporting period an EP selects does not affect the amount of the EHR incentive payments. The Medicare EHR incentive payments to EPs are based on 75% of the estimated allowed charges for covered professional services furnished by the EP during the entire payment year. If the EP has not met the $24,000 threshold in allowed charges at the time of attestation, CMS will hold the incentive payment until the EP meets the threshold as described above.
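The threshold arithmetic CMS describes can be sketched as follows. This is a simplified illustration of the 2011 Medicare EP rules as quoted above (75% of allowed charges, payment held until the $24,000 threshold, which yields the $18,000 maximum first-year payment); the authoritative logic is CMS's:

```python
# Simplified illustration of the 2011 Medicare EP incentive arithmetic
# quoted above. Payment is 75% of allowed charges and is held until
# allowed charges reach $24,000, at which point 75% yields the maximum
# $18,000 first-year payment.

THRESHOLD = 24_000       # allowed charges needed for the maximum incentive
RATE = 0.75              # incentive is 75% of allowed charges
CAP = RATE * THRESHOLD   # $18,000 maximum for a 2011 first-year EP

def incentive_status(allowed_charges):
    """Return (payable_now, amount) for a given level of allowed charges."""
    if allowed_charges >= THRESHOLD:
        return True, CAP         # threshold met: maximum payment issues
    # Below the threshold, CMS holds the payment to maximize the incentive;
    # if still unmet at year end, 75% of actual charges would be paid.
    return False, RATE * allowed_charges

print(incentive_status(30_000))  # (True, 18000.0)
print(incentive_status(10_000))  # (False, 7500.0)
```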
Medicare EHR incentive payments to eligible hospitals and critical access hospitals (CAHs) will also be made approximately four to eight weeks after the eligible hospital or CAH successfully attests to having demonstrated meaningful use of certified EHR technology. Eligible hospitals and CAHs will receive an initial payment and a final payment. Eligible hospitals and CAHs that attest in April can receive their initial payment as early as May 2011. Final payment will be determined at the time of settling the hospital cost report.
Please note that the Medicaid incentives will be paid by the States, but the timing will vary according to State. Please contact your State Medicaid Agency for more details about payment.
For more information about the Medicare and Medicaid EHR Incentive Program, visit the website.
For an overview, see the Medicare Learning Network (MLN) Matters Special Edition article (SE1111) – Medicare Electronic Health Record (EHR) Incentive Payment Process.
IMPORTANT NOTE: Medicare Administrative Contractors (MACs), carriers, and Fiscal Intermediaries (FIs) will not be making Medicare EHR incentive payments. CMS has contracted with a Payment File Development Contractor to make these payments.
DON'T: Call your MAC/Carrier/FI with questions about your EHR incentive payment.
INSTEAD: Call the EHR Information Center
Hours of Operation: 7:30 a.m. – 6:30 p.m. (Central Time) Monday through Friday, except federal holidays.
1-888-734-6433 (primary number) or 888-734-6563 (TTY number).
Monday 2 May 2011
What Keeps Me Up at Night in the Data Center
Last week, I keynoted the Markley Group annual meeting and spoke about the data center issues that keep me up at night.
1. At Harvard Medical School, increasing amounts of research are done in silico rather than in wet labs. The growth in demand is unpredictable and bursty. When grants are funded, demand for new equipment is instantaneous. Data centers often have fixed real estate, limited power, and constrained capital budgets for expansion, making unplanned expansion problematic.
2. There is zero tolerance for downtime in the face of constantly changing technologies. We need to continuously innovate, providing the latest technology while maintaining existing systems at high levels of reliability.
3. Power and cooling needs are increasing exponentially. We've already virtualized all our application servers and we're beginning to virtualize database servers. Virtualizing high performance computing nodes does not really help since those nodes require maximal raw processing power. Harvard Medical School's compute cluster has 6000 cores. Our data center infrastructure needs optimal power usage efficiency to minimize energy costs.
4. Storage demand is now multi-petabyte. Drive density is increasing and costs are falling, but backing up and archiving petabytes is still a challenge.
5. Regulatory and compliance requirements now require searching and e-discovery of increasingly complex data stores. Although most healthcare organizations typically do not face Sarbanes-Oxley reporting requirements, other requirements such as HIPAA, ARRA/HITECH, and the Affordable Care Act have their own data retention and analysis implications.
My solution to many of these issues has been to create "elastic data centers" using external hosting facilities such as those provided by the Markley Group. Harvard Medical School has two such floors - a "low density" 5kW/rack 1,000 square foot floor with an option to expand to 5,000 square feet and a "high density" 30kW/rack floor with unlimited expansion capabilities. This flexibility enables me to shift the burden of power and cooling planning to someone else, while enabling me to serve my customers on demand.
BIDMC's EHR hosting center is another example of an elastic data center. We provide a private cloud with the eClinicalWorks EHR offered via a Software as a Service model. The problem is that we do not know how many clinicians we'll support over time, so we contracted for an outsourced hosting center with easy expandability.
What will the next few years bring in data centers? My prediction is that
• On demand storage and compute cycles from private cloud facilities will become commonplace
• Clusters and Grids will enable communities of collaborators to flexibly share processing power
• Green Data Centers with Power Usage Effectiveness less than 1.50 will reduce the rate of growth of data center energy costs
• HIPAA-compliant private clouds will evolve to enable EHRs and other person-identified data to be hosted in the cloud
• The amount of storage and compute cycles needed to meet increasing demands will strain existing hospital-owned data centers, resulting in more elastic data centers hosted externally.
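Power Usage Effectiveness, mentioned in the third prediction, is simply total facility energy divided by the energy delivered to the IT equipment. A quick illustration (the kW figures below are made-up examples, not Markley or HMS numbers):

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment
# energy. A perfect facility scores 1.0; a PUE of 1.50 means at most
# 0.5 watts of overhead (cooling, power distribution, lighting) for
# every watt reaching the IT load. Figures below are made-up examples.

def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

# A 1,000 kW IT load in a facility drawing 1,450 kW overall:
print(round(pue(1450, 1000), 2))  # 1.45 -- under the 1.50 "green" bar

# The same load in an older facility drawing 2,000 kW:
print(round(pue(2000, 1000), 2))  # 2.0
```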
It's an exciting time to be in IT!
Friday 29 April 2011
Cool Technology of the Week
A major theme in healthcare IT lately has been the value of unstructured healthcare data, which can be mined using natural language processing and search technologies to produce meaningful knowledge.
Although the transformation of unstructured data into structured data is a new concept in healthcare, there's a commercial website that illustrates its power - TripIt.com.
TripIt is an itinerary consolidation and sharing tool that's very simple to use. You email any trip confirmations (air, car, hotel, etc.) to plans@tripit.com and TripIt combines all of the elements into one itinerary. That itinerary can then be saved to your calendar, viewed on the web, accessed via mobile devices, and shared with others.
There are three major functions – itinerary collation, itinerary management and itinerary sharing.
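To give a feel for the itinerary collation step, here's a toy sketch of the kind of unstructured-to-structured extraction involved. The confirmation format below is made up and the regexes handle only that one layout - TripIt's production parsers handle the wildly varied formats real travel sites emit:

```python
# Toy sketch of unstructured-to-structured itinerary extraction, using
# regexes on a simplified, made-up confirmation format. Real parsers
# must handle many vendor-specific layouts.
import re

CONFIRMATION = """\
Flight: AS 123
Depart: BOS 2011-06-10 08:15
Arrive: ANC 2011-06-10 13:40
"""

def parse_confirmation(text):
    """Pull flight, departure, and arrival fields into a dict."""
    fields = {}
    m = re.search(r"Flight:\s*(\S+\s\d+)", text)
    if m:
        fields["flight"] = m.group(1)
    for leg in ("Depart", "Arrive"):
        m = re.search(rf"{leg}:\s*(\w+)\s+(\S+)\s+(\S+)", text)
        if m:
            fields[leg.lower()] = {"airport": m.group(1),
                                   "date": m.group(2),
                                   "time": m.group(3)}
    return fields

itinerary = parse_confirmation(CONFIRMATION)
print(itinerary["flight"], itinerary["depart"]["airport"])  # AS 123 BOS
```

Once the fields are structured, the calendar, mobile, and sharing features described below become straightforward to build on top.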
To test itinerary collation, I emailed an Expedia confirmation from an upcoming Alaska trip (I'm keynoting a HIMSS event in Anchorage in June then climbing for a few days). The free text was transformed perfectly into the structured data shown in the graphic above, including automatic weather and map information. There's an iPhone, Android and Blackberry app to access this structured data via mobile devices. The iPhone app worked perfectly.
I use Apple Mail, and by simply clicking on the calendar integration feature of TripIt, my full itinerary was automatically added to my calendar.
I shared my itinerary with my wife by inviting her to join Tripit via her gmail account. I also added my Tripit itinerary to my Facebook wall.
A natural language processing application that turns unstructured confirmation emails into web, mobile and social networking accessible structured data. That's cool!
Thursday 28 April 2011
My 2011 Garden Plan
It's Spring in New England and I'm preparing my gardens.
This year, I planted oak leaf lettuce and spinach in a cold frame and selected seeds for a Summer raised bed garden of eggplant, cucumbers, peas, beans, and heirloom cherry tomatoes.
5 years ago, my wife and I joined the waiting list for a space in the Wellesley Community Garden on Brookside Road. We were just notified that we'll be granted a space this year. This means that we'll have a 32 x 25 foot plot to share with another family. Our plan is to install several raised beds and plant Japanese pumpkins (Kabocha) and other vegetables that require generous amounts of sunny, well-drained, flat ground - something we lack in our backyard because our 100 foot hemlocks cast shade much of the year.
All our seeds come from the Kitazawa Seed Company, a truly remarkable supplier.
For the next few weekends, I'll be tilling soil, hauling mulch, building fences, and installing raised beds. My plan for new fencing to keep rabbits, squirrels and chipmunks from eating our fresh produce is pictured above. I found two great design resources - one about wire fencing and one about raised beds.
We've lived in New England for 15 growing seasons so I've learned not to plant tender seedlings until after mid May. It's still possible to have a hard freeze in April despite the temptation to plant induced by occasional 70 degree days.
As my daughter goes off to college and we enter the next stage of life (51-60), the time in our backyard garden and our new community garden space will be very therapeutic.
The rituals of the planting/harvesting cycle, the anticipation of fresh vegetables, and physical labor of small scale farming melt away all the problems of the week. I look forward to a weekend in the dirt!
Wednesday 27 April 2011
National Strategy for Trusted Identities in Cyberspace
On April 15, 2011, the White House released the National Strategy for Trusted Identities in Cyberspace (NSTIC) during a launch event that included U.S. Secretary of Commerce Gary Locke, other Administration officials, and U.S. Senator Barbara Mikulski, as well as a panel discussion with private sector, consumer advocate, and government ID management experts.
What is a trusted identity in cyberspace? This animation describes the scope of the effort. It includes smartcards, biometrics, soft tokens, hard tokens, and certificate management applications.
NSTIC envisions a cyber world - the Identity Ecosystem - that improves upon the passwords currently used to access electronic resources. It includes a vibrant marketplace that allows people to choose among multiple identity providers - both private and public - that will issue trusted credentials proving identity.
Why do we need it?
NSTIC provides a framework for individuals and organizations to utilize secure, efficient, easy-to-use and interoperable identity solutions to access online services in a manner that promotes confidence, privacy, choice and innovation.
Shopping, banking, social networking, and accessing employee intranets result in greater opportunities for innovation and economic growth, but the online infrastructure for supporting these services has not evolved at the same pace. The National Strategy for Trusted Identities in Cyberspace addresses two central problems impeding economic growth online - 1) Passwords are inconvenient and insecure
2) Individuals are unable to prove their true identity online for significant transactions.
Identity theft is costly, inconvenient and all-too common
*In 2010, 8.1 million U.S. adults were the victims of identity theft or fraud, with total costs of $37 billion.
*The average out-of-pocket loss of identity theft in 2008 was $631 per incident.
*Consumers reported spending an average of 59 hours recovering from a “new account” instance of ID theft.
Phishing continues to rise, with attacks becoming more sophisticated
*In 2008 and 2009, specific brands or entities were targeted by more than 286,000 phishing attacks, all attempting to replicate their site and harvest user credentials.
*A 2009 report from Trusteer found that 45% of targets divulge their personal information when redirected to a phishing site, and that financial institutions are subjected to an average of 16 phishing attacks per week, costing them between $2.4 and $9.4 million in losses each year.
Managing multiple passwords is expensive
*A small business of 500 employees spends approximately $110,000 per year on password management. That’s $220 per user per year.
Passwords are failing
*In December 2009, the RockYou password breach revealed the vulnerability of passwords. Nearly 50% of users' passwords were names, slang words, or dictionary words, or were extremely weak, with passwords like "123456".
Maintenance of multiple accounts is increasing as more services move online
*One federal agency with 44,000 users discovered over 700,000 user accounts, with the average user having 16 individual accounts.
Improving identity practices makes a difference
*Implementation of strong credentials across the Department of Defense resulted in a 46% reduction in intrusions.
*Use of single sign-on technologies can reduce annual sign-in time by 50 hours/user/year.
The next step is creation of a national program office to manage the project and coordinate public-private efforts. I look forward to a voluntary, opt in strong identity for e-commerce. Who knows, if this effort is successful, maybe we can move forward with a voluntary, opt in strong identity for healthcare.
Tuesday 26 April 2011
Business Spam
Our Proofpoint Spam filters remove the Nigerian businessmen and Viagra ads from my email stack. However, it's really challenging to auto-delete legitimate business email from major companies that I would just rather not read.
Business Spam (BS) is what I call the endless stream of chaff filling my inbox with sales and marketing fluff. If a colleague emails me about a cool new emerging technology, I'm happy. If a trusted business partner gives me a preview of a new product and offers me the opportunity to beta test it, I'm thrilled. If Bob at XYZ.com describes their cloud-based, software-as-a-service, offshore, outsourced, app store compliant product line that's compiled in PowerPoint (i.e. it does not yet exist except in sales and marketing materials), I press delete as fast as I can.
Since there are multiple domains that can be used to reach me - bidmc.harvard.edu, caregroup.harvard.edu, caregroup.org, etc. - many email list sellers vend 5 or 6 variations of my email address, resulting in 5 or 6 copies of each life-changing offer in my inbox.
Now I know why some say email is dead. Email is a completely democratic medium. Anyone can email anyone. There are no ethical or common sense filters. The result is that Business Spam will soon outnumber my legitimate email.
Social networking architectures offer an alternative. I'm on Facebook, Twitter, LinkedIn, Plaxo etc. In those applications, individuals request access to me. Based on their relationships to my already trusted colleagues and my assessment of their character, I either allow or deny access. Once I "friend" them, appropriate communications can flow. If the dialog becomes burdensome or inappropriate, I can "block" them.
In order to stay relevant, email needs to incorporate social networking-like features. It should be easy to block individuals, companies, or domains that I do not want to hear from. Today, when a vendor ignores my pleas to remove me from their emailing list (demonstrating a lack of compliance with anti-spamming policies), I ask our email system administrator to blacklist their entire domain, preventing the flow of their Business Spam across the enterprise.
For those of you who use unsolicited business email as a marketing technique, beware. Your message is not only diluted by the sheer volume of companies generating Business Spam, but it also creates a negative impression among your recipients.
My advice - send your customers a newsletter describing your products and services. Ask them to opt in to receive future messages. If they do not respond, stop sending them. It's just like a Facebook request - you pick your friends and your friends pick you.
The alternative is that all your communications will be deemed Business Spam and blocked at the front door. Do you really want all your customers to say your emails are BS (Business Spam)?
Monday 25 April 2011
Facebook's Green Data Center
In my roles as CIO at Harvard Medical School and Beth Israel Deaconess Medical Center, I oversee 4 data centers (one primary and one disaster recovery site for each institution). Over the past several years, I've not been challenged by data center real estate, I've been challenged by power and cooling demands.
My teams have invested substantial time and effort into enhancing our power usage effectiveness (PUE) - the ratio of total power consumption including cooling and transformer losses divided by how much of the power is actually used by computing equipment.
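As a worked example of the ratio, here's a minimal sketch. The kilowatt figures are invented, chosen only so the arithmetic lands on the 1.82 figure discussed in this post:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power draw (IT load plus
    cooling, transformer, and distribution losses) divided by the power
    actually delivered to computing equipment. 1.0 would be a perfect
    facility with zero overhead; lower is better."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: a 910 kW facility draw supporting a
# 500 kW IT load yields a PUE of 1.82.
print(round(pue(910, 500), 2))  # 1.82
```

The denominator is what matters to applications; everything above it is overhead, which is why cold aisle containment and similar measures attack the numerator.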
In the graphic above, BIDMC has achieved a PUE of 1.82, which is low compared to many corporations. We've done cold aisle containment, floor tile ventilation, and hot air recapture to reduce our Computer Room Air Conditioning (CRAC) needs substantially. We've matched the average of most green computing initiatives.
Despite all our efforts, we are limited by the constraints of the standard commercial hardware we run and the building we use.
Facebook has designed its own buildings and created its own servers via its Open Compute Project. The initial power usage effectiveness ratio is 1.07, compared with an average of 1.5 for their existing facilities.
Here's an overview of how they did it.
They've removed uninterruptible power supplies and centralized chilling units, which we cannot do because of architectural/engineering limitations of our building design. We're likely to achieve a PUE of 1.5, but could only achieve 1.07 by opening a new, purpose-built data center.
Here's a look at the kind of energy efficiency that cloud providers are achieving by creating dedicated mega data center buildings.
On April 28, I'm keynoting the Markley Group's annual meeting and you can be sure that I'll include power and cooling in my list of the things that keep me up at night.
Congratulations, Facebook!
Friday 22 April 2011
Cool Technology of the Week
I'm a great fan of creating networks of networks for healthcare information exchange. Point-to-point interoperability does not scale, but creating local or regional collaborations that enable large numbers of organizations to connect with minimal interfacing works very well.
Today, Surescripts announced the Lab Interoperability Cooperative to connect hospital labs with public health agencies.
In Massachusetts, NEHEN has worked with the Boston Public Health Commission and the Massachusetts Department of Public Health to enable all the hospitals in Eastern Massachusetts to send reportable lab, syndromic surveillance, and immunization information by simply connecting HL7 2.5.1 transmissions to a single gateway.
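To make the HL7 2.5.1 piece concrete, here's a minimal, invented ORU^R01 (lab result) message showing the pipe-delimited segment structure such a gateway consumes. The facility names, patient, and codes are illustrative, not real data or NEHEN's actual feed:

```python
# A minimal, invented HL7 2.5.1 ORU^R01 lab-result message. Segments are
# separated by carriage returns and fields by pipes; all values here are
# fabricated for illustration.
hl7_message = "\r".join([
    "MSH|^~\\&|LAB|HOSPITAL|ELR|DPH|20110422||ORU^R01^ORU_R01|MSG0001|P|2.5.1",
    "PID|1||12345^^^HOSPITAL^MR||DOE^JANE",
    "OBR|1|||625-4^Bacteria identified in Stool by Culture^LN",
    "OBX|1|CWE|625-4^^LN||27268008^Salmonella^SCT||||||F",
])

def segments(message):
    """Split an HL7 v2 message into {segment_id: field_list} for inspection."""
    result = {}
    for segment in message.split("\r"):
        fields = segment.split("|")
        result[fields[0]] = fields
    return result

parsed = segments(hl7_message)
# List index 11 corresponds to MSH-12 (version ID), since MSH-1 is the
# field separator itself.
print(parsed["MSH"][11])  # 2.5.1
```

The appeal of the single-gateway model is visible here: every hospital emits the same segment grammar, so the public health agency parses once rather than maintaining a bespoke interface per sender.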
Surescripts has the same plan but on a national scale. Hospitals interested in participating can register by completing the “Phase I Checklist” by April 29, 2011.
The project is funded by a grant from the Centers for Disease Control with participation from the American Hospital Association and the College of American Pathologists. During the two-year grant period, the project will recruit, educate and connect a minimum of 500 hospital labs to the appropriate public health agencies. At least 100 will be critical access or rural hospitals.
Based on the Surescripts Network for Clinical Interoperability, the project will support all federal and state policies and standards for health information exchange, including privacy and security standards.
A standards-based network to connect hospital labs and public health agencies. That's cool!
Thursday 21 April 2011
Upcoming Conferences
Spring is speaking season and here are two upcoming conferences of interest. I'll be moderating panels at both.
The first conference is "Enabling the Adoption of HIT to Transform Patient Care" on April 25, 2011 at the Harvard Club of Boston.
This conference features keynotes by Dr. David Blumenthal who will speak about his vision for modernizing health care delivery and Dr. David Bates who will discuss using health IT to improve patient safety. The conference will also feature two panels. The first will focus on supporting providers to achieve meaningful use of EHR. The second will focus on new and innovative technologies to engage patients and providers in care delivery.
The conference is the result of hard work by the HSPH Public Health & Technology (PHAT) Forum, a graduate student organization whose mission is to provide an interactive, cross-disciplinary forum for exploration and innovation at the intersection of health, information, and technology.
The second conference is the Governor's Health IT Conference hosted by Deval Patrick at the DCU Center, Worcester, MA May 9-10.
Keynotes include Deval Patrick, Dr. David Blumenthal, and Sachin Jain, MD, MBA, Special Assistant to the National Coordinator. Topics include:
*How the Office of the National Coordinator will fund the deployment of electronic health records and the exchange of data among these systems
*Governor Patrick's proposal for transforming the healthcare payment system
*Medicare and Medicaid initiatives for quality improvement and shared savings
*The contributions that health IT will make to clinical quality, patient-centeredness, and the economic recovery in Massachusetts
See you at these conferences!
Wednesday 20 April 2011
The April HIT Standards Committee meeting
The April HIT Standards Committee meeting included important discussions about the timeline for the work ahead, how we'll organize to do that work, and how the HIT Standards Committee will interact with the HIT Policy Committee and the S&I Framework to ensure ONC has the final certification and standards criteria needed for meaningful use stage 2.
The meeting began with an overview by Farzad Mostashari, the new National Coordinator. He described the perfect storm of events we have in healthcare now - Meaningful Use Stage 2 from the HITECH Act, Accountable Care Organizations from the Affordable Care Act, and the Partnership for Patients: Better Care, Lower Cost from HHS.
Paul Tang then reviewed the likely characteristics of Stage 2. He summarized the public comment on stage 2 noting strong support for eRx of discharge medications, electronic progress notes, electronic medication administration records, secure messaging, and recording patient preferences for communications. He noted mixed support for other initiatives such as the list of care team members and longitudinal care plans. He highlighted the concerns about timelines for stage 2, especially the lead time required to create and widely implement new functionality. Options include reducing the reporting period for Stage 2 from 1 year to 90 days resulting in a 9 month delay, deferring stage 2 for a year, and splitting Stage 2 into 2a for increased thresholds on existing measures and 2b for introducing new technology. We discussed the need for detailed descriptions of the new stage 2 functionality so that we can determine where new standards are needed. The committee had a robust discussion about the need to work on certification criteria and testing criteria for stage 2, since the committee's stage 1 input was limited to standards alone.
Doug Fridsma then discussed the process we'll use this Summer to complete the necessary standards work in support of MU Stage 2. Steve Posnak provided a visual representation of the interactions among the HIT Policy Committee, Standards Committee and ONC described in the HITECH Act.
Doug outlined a very practical idea - as the HIT Standards Committee studies the standards to support stage 2, it should divide them into the following buckets:
A) Those meaningful use criteria for which no standards are needed - they are process goals
B) Those meaningful use criteria for which one existing standard exists and is a perfect fit
C) Those meaningful use criteria for which standards exist, but they are imperfect and need work
D) Those meaningful use criteria for which no mature standards exist
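The triage above can be sketched as a simple grouping step. The criteria and bucket assignments below are invented for illustration; they are not the committee's actual determinations:

```python
# Hypothetical mapping of meaningful use criteria to the four buckets
# described above. The assignments are illustrative only.
BUCKETS = {
    "A": "no standard needed (process goal)",
    "B": "one existing standard fits",
    "C": "standards exist but need work",
    "D": "no mature standard exists",
}

criteria = {
    "record smoking status": "A",
    "e-prescribing of discharge medications": "B",
    "longitudinal care plan": "D",
}

def triage(criteria):
    """Group criteria by bucket; C and D go to the S&I Framework."""
    grouped = {bucket: [] for bucket in BUCKETS}
    for name, bucket in criteria.items():
        grouped[bucket].append(name)
    return grouped

grouped = triage(criteria)
needs_si_framework = grouped["C"] + grouped["D"]
print(needs_si_framework)  # ['longitudinal care plan']
```

The value of the bucketing is that only the C and D lists consume scarce standards-development effort; A and B need no new work at all.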
In the next few weeks, the Standards Committee will develop a detailed workplan for May and June by placing meaningful use stage 2 priorities into these buckets and then assigning them to ad hoc workgroups ("power teams") to rapidly analyze. We'll hand off those in buckets C and D to the S&I framework for further deliberation. The S&I groups will return their finished work to the Standards Committee for review and final polish.
Jamie Ferguson provided an update on the vocabulary task force, noting its efforts to specify one major vocabulary for each domain area (labs, problems, allergies, medications, etc.), identify vocabulary subsets/codesets that will accelerate interoperability, and ensure mappings between vocabularies are available where necessary.
Paul Egerman presented an overview of the PCAST Workgroup recommendations. Doug Fridsma discussed an analysis of existing metadata standards prepared by Mitre.
Paul's work as facilitator of the PCAST Workgroup effort was truly remarkable and his energy resulted in a comprehensive report that suggests a very reasonable path forward. One early pilot we discussed was the use of the Universal Exchange Language to send data from EHRs to PHRs. An existing CCR or CCD could be wrapped with patient identification and provenance (who generated the data in what setting) metadata and incorporated into PHRs, assuming existing PHR vendors can be convinced to accept a universal format.
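As a sketch of the wrapping idea (not the actual Universal Exchange Language, which was still being specified), an envelope might carry patient identification and provenance metadata alongside the untouched clinical payload. All field names here are hypothetical:

```python
import base64
import json
from datetime import datetime, timezone

def wrap_ccd(ccd_xml, patient_id, source_org):
    """Wrap an existing CCD/CCR document with patient-identification and
    provenance metadata, leaving the clinical payload untouched.
    The envelope field names are invented for illustration; the real
    metadata tagging scheme was still under discussion by the workgroup."""
    return json.dumps({
        "metadata": {
            "patient_id": patient_id,
            "provenance": {
                "source": source_org,  # who generated the data, in what setting
                "wrapped_at": datetime.now(timezone.utc).isoformat(),
            },
        },
        # Base64 keeps the XML payload opaque to the envelope, so a PHR
        # can read the metadata without parsing the clinical document.
        "payload": base64.b64encode(ccd_xml.encode()).decode(),
    })

envelope = wrap_ccd("<ClinicalDocument>...</ClinicalDocument>",
                    patient_id="example-mrn-001",
                    source_org="Example Hospital")
print(json.loads(envelope)["metadata"]["patient_id"])  # example-mrn-001
```

The design point is that the existing CCR/CCD travels unchanged; only a thin, uniformly structured wrapper is added, which is what would let PHR vendors accept one universal format.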
Jim Walker provided an overview of the Clinical Quality Workgroup including their plan for supporting meaningful use stage 2 measures.
Dixie Baker and Walter Suarez presented their plan for Provider Directory work. Upcoming testimony will enable them to make recommendations at the May Standards Committee meeting.
A great meeting. As a next step, we'll set up a group to evaluate the metadata possibilities for a PCAST pilot. We'll complete the workplan for the Summer and begin assigning that work. The next few months will be a sprint as we complete all the work needed to support the next level of Meaningful Use.
The meeting began with an overview by Farzad Mostashari, the new National Coordinator. He described the perfect storm of events we have in healthcare now - Meaningful Use Stage 2 from the HITECH Act, Accountable Care Organizations from the Affordable Care Act, and the Partnership for Patients: Better Care, Lower Cost from HHS.
Paul Tang then reviewed the likely characteristics of Stage 2. He summarized the public comment on stage 2 noting strong support for eRx of discharge medications, electronic progress notes, electronic medication administration records, secure messaging, and recording patient preferences for communications. He noted mixed support for other initiatives such as the list of care team members and longitudinal care plans. He highlighted the concerns about timelines for stage 2, especially the lead time required to create and widely implement new functionality. Options include reducing the reporting period for Stage 2 from 1 year to 90 days resulting in a 9 month delay, deferring stage 2 for a year, and splitting Stage 2 into 2a for increased thresholds on existing measures and 2b for introducing new technology. We discussed the need for detailed descriptions of the new stage 2 functionality so that we can determine where new standards are needed. The committee had a robust discussion about the need to work on certification criteria and testing criteria for stage 2, since the committee's stage 1 input was limited to standards alone.
Doug Fridsma then discussed the process we'll use this Summer to complete the necessary standards work in support of MU Stage 2. Steve Posnak provided a visual representation of the interactions among the HIT Policy Committee, Standards Committee and ONC described in the HITECH Act.
Doug outlined a very practical idea - as the HIT Standards Committee studies the standards to support stage 2, it should divide them into buckets as follows
A) Those meaningful use criteria for which no standards are needed - they are process goals
B) Those meaningful use criteria for which one existing standard exists and is a perfect fit
C) Those meaningful use criteria for which standards exist, but they are imperfect and need work
D) Those meaningful use criteria for which no mature standards exist
In the next few weeks, the Standards Committee will develop a detailed workplan for May and June by placing meaningful use stage 2 priorities into these buckets and then assigning them to ad hoc workgroups ("power teams") to rapidly analyze. We'll handoff those in buckets C and D to the S&I framework for further deliberation. The S&I groups will return their finished work to the Standards Committee for review and final polish.
Jamie Ferguson provided an update on the vocabulary task force, noting its efforts to specify one major vocabulary for each domain area (labs, problems, allergies, medications etc.), identify vocabulary subsets/codesets that will accelerate interoperability, and ensuring mappings between vocabularies are available where necessary.
Paul Egerman presented an overview of the PCAST Workgroup recommendations. Doug Fridsma discussed an analysis of existing metadata standards prepared by Mitre.
Paul's work as facilitator of the PCAST Workgroup effort was truly remarkable and his energy resulted in a comprehensive report that suggests a very reasonable path forward. One early pilot we discussed was the use of the Universal Exchange Language to send data from EHRs to PHRs. An existing CCR or CCD could be wrapped with patient identification and provenance (who generated the data in what setting) metadata and incorporated into PHRs, assuming existing PHR vendors can be convinced to accept a universal format.
Jim Walker provided an overview of the Clinical Quality Workgroup including their plan for supporting meaningful use stage 2 measures.
Dixie Baker and Walter Suarez presented their plan for Provider Directory work. Upcoming testimony will enable them to make recommendations at the May Standards Committee meeting.
A great meeting. As a next step, we'll set up a group to evaluate the metadata possibilities for a PCAST pilot. We'll complete the workplan for the Summer and begin assigning that work. The next few months will be a sprint as we complete all the work needed to support the next level of Meaningful Use.
Tuesday 19 April 2011
Mobile Applications for Medical Education
Every year in April, we survey the HMS medical students about their use of mobile devices.
At HMS, we encourage students to buy the device of their choice - iPhone/iPod/iPad, Android, Blackberry, Kindle etc. We then support these devices with software licenses and controlled hosted applications.
Our Mycourses Learning Management System has a Mobile Applications tab. Under General Resources, we offer a mobile version of all course content via connected devices (WiFi, 3G etc.). We also offer a Kindle version for downloading course content to the device.
On our Mobile Resources page, we offer downloads of many popular applications. Most include native iPad support.
What are the most popular in 2011?
Dynamed - a clinical reference tool created by physicians for physicians and other health care professionals for use primarily at the point of care.
Unbound Medicine uCentral - a collection of popular titles including 5 Minute Clinical Consult, A to Z Drug Facts, Drug Interaction Facts (an interaction checker), Review of Natural Products, Medline Table of Contents Alerts, and Medline Auto Alerts.
VisualDx Mobile - a visual decision support tool. VisualDx merges medical images with a problem-oriented findings-based search.
Epocrates Essentials - an all-in-one mobile guide to drugs, diseases, and diagnostics which includes Epocrates Rx Pro, Epocrates SxDx, and Epocrates Lab.
iRadiology - a compendium of over 500 unique images demonstrating classic radiological findings.
I'll post the complete survey for 2011 soon.