Many organizations – public and private – use formal competitive bidding instruments to procure technical products and services. This may be required by law, policy, or practice. The driving reasons are to ensure both a clear understanding of the organization's requirements and expectations and fairness in the marketplace to all potential respondents who wish to do business with the organization. HLN's new white paper, Effective Technical RFP Development: A Guide for Jurisdictions and Other Organizations, offers practical advice for organizations issuing competitive solicitations.
By Noam H. Arzt and Michael Berry
It is common practice today to encrypt data at rest, that is, data stored on servers. To build off an old adage, no one ever got fired for encrypting their data. But what protection does that really provide? Is just encrypting data enough?
First, let’s distinguish among three methods for encrypting data at rest.
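One of those methods is application-level encryption, where the application itself encrypts records before writing them to storage and decrypts them on read. The sketch below illustrates the idea only; the XOR "cipher" and the sample record are placeholders and are not secure, so a real system would use a vetted algorithm (e.g., AES via a maintained cryptography library) and keep the key in a separate key store:

```python
# Toy illustration of application-level encryption at rest.
# NOTE: XOR with a repeating key is NOT secure -- it stands in here
# for a real cipher (e.g., AES) purely to show the data flow.
import secrets


def xor_bytes(data: bytes, key: bytes) -> bytes:
    """'Encrypt'/'decrypt' by XORing data against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


# In practice the key lives in a key management service, never
# alongside the data it protects.
key = secrets.token_bytes(32)

record = b"patient: Jane Doe, MRN 12345"     # hypothetical sample record

stored = xor_bytes(record, key)              # what actually lands on disk
assert stored != record                      # unreadable to whoever steals the disk

recovered = xor_bytes(stored, key)           # the application decrypts on read
assert recovered == record
```

Note what this does and does not protect: a stolen disk or backup yields only ciphertext, but any attacker who compromises the running application (which holds the key) sees the data in the clear, which is one reason encrypting data at rest alone is not enough.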
HLN’s president, Dr. Noam Arzt, participated in a regional meeting in Hawai’i sponsored by the American Immunization Registry Association (AIRA). Bringing together Immunization Information Systems managers and staff from the Pacific Islands, this two-day session focused on the unique needs of these island jurisdictions with respect to IIS functionality and interoperability. Dr. Arzt participated as a panelist in a session on successful vendor-client relationships.
Most public health information technology projects rely on strong collaboration to be successful, especially across vendor-client boundaries. Here are some successful strategies:
- Clear vision. A concise and clear vision focused on public health outcomes is embraced and articulated by all participants in the project.
- Strong support and leadership from senior management. Without strong support from senior management, projects are rarely given the priority to enable success. This prioritization includes both agency and vendor commitment.
- Funding. Both external (Federal) and internal (state/local) funding need to be committed to enable success, though long-term sustainability is an ongoing issue.
In an earlier post I wondered whether public health’s siloed systems might not be more appropriately thought of as siloed data. But after attending a meeting of the Joint Public Health Informatics Taskforce (JPHIT), I am wondering whether the issue is really siloed workflow.
In public health, data is used to support specific programs, and systems develop to provide a means to collect, analyze, and disseminate this data. Individuals in the programs define the data sets and create systems that support specific protocols and activities considered unique to the program area. This is often a result of increased specialization in both clinical and epidemiological practice, and it can lead to processes that are at their core quite similar being described in divergent ways. Data definitions, codes, and terminology sets also tend to evolve divergently even when they describe the same qualities or attributes, often of the same patients, conditions, or environment.
Public health agencies need to focus on the commonalities across their programs rather than on the differences. Existing and emerging standards activities should help promote a convergence of systems, data, and workflow to increase interoperability, reduce redundancy, and promote sharable, reusable, cheaper system components. As collaboration among programs and agencies moves some implementations to shared solutions or cloud-based implementations, public health needs to be careful not to create a set of siloed platforms that provide parallel, non-interoperating services to the same agencies.