At DocSpot, our mission is to connect people with the right health care by helping them navigate publicly available information. We believe the first step of that mission is to help connect people with an appropriate medical provider, and we look forward to helping people navigate other aspects of their care as the opportunities arise. We are just at the start of that mission, so we hope you will come back often to see how things are developing.
An underlying philosophy of our work is that the right care means different things to different people. We also recognize that doctors are multidimensional people. So, instead of trying to determine which doctors are "better" than others, we offer a variety of filter options that individuals can apply to more quickly discover providers that fit their needs.
As you know, our primary focus at DocSpot has been to connect you with individual health care providers. This week, I had hoped to unveil a new service that would allow you to search for hospitals, but the final touch-ups have taken me longer than I expected. Sometimes the smallest segments of a product can take the longest amount of time. Such is the nature of development.
In this case, I discovered that one of our sources of data was not as tidy as we had thought. Since we deal with publicly available data, we don't expect everything to be nicely sorted and packaged for us. That's what our specialized "robots" are for. However, there are certain times when the data proves to be incorrigible, and we must either reject it as a primary source or dispose of it altogether.
I had relatively high expectations for Medicare's "Providers of Service" list; although publicly available, it is not free. And at first glance, it seemed polished and straightforward to integrate. But when I ran some diagnostics, I hit the worst nightmare of any engineer tasked with data management: duplicates. Multiple hospitals with the same address and the same name - but different data. I had no idea which profile was correct, and the data's documentation gave no indication of how to resolve the issue; it didn't even mention possible redundancies.
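For the technically curious, the kind of diagnostic that surfaces these duplicates is simple to sketch. The snippet below is illustrative only - the field names and toy records are invented, not Medicare's actual column names - but it shows the basic idea: group records by name and address, then flag any group with more than one record.

```python
from collections import defaultdict

def find_duplicates(rows):
    """Group provider records by (name, address) and return the
    groups that contain more than one record - i.e. candidate duplicates."""
    groups = defaultdict(list)
    for row in rows:
        key = (row["name"].strip().lower(), row["address"].strip().lower())
        groups[key].append(row)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

# Toy data standing in for the real file (field names are made up).
rows = [
    {"name": "Broughton Hospital", "address": "1000 S Sterling St", "beds": "297"},
    {"name": "Broughton Hospital", "address": "1000 S Sterling St", "beds": "283"},
    {"name": "Mercy Hospital", "address": "2001 Vail Ave", "beds": "185"},
]
dupes = find_duplicates(rows)
print(len(dupes))  # 1 - one (name, address) pair appears more than once
```

Spotting the duplicates is the easy half, of course; deciding which of the conflicting profiles to trust is where things got messy.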
So, as engineers are wont to do, I started looking for patterns. I found a reference number that might link one duplicate to the next, a date that seemed to indicate when a profile was last updated, a code that suggested a hospital had been shut down, and a category that appeared to single out duplicate entries. In the end, the relationships seemed too arbitrary, and I hadn't even rooted out all the redundancies. One pair in particular - two profiles for Broughton Hospital, in North Carolina - seemed to mock my efforts: differing by only one or two data points, they matched on every single metric I used to differentiate between duplicate profiles.
After almost giving up on this rich source of data, I finally discovered another Medicare file (in a completely different section of their website) that identifies the unique entries in the problematic source. Problem solved. The question remains - will there be yet another set of finishing touches? Time will tell - such is the nature of product development. In the meantime, keep checking our blog for updates, and let us know what you would like to see in our upcoming hospital product.
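That final step amounts to a simple filter: keep only the rows whose identifier appears in the second file. Here is a rough sketch of the idea in Python - the file contents and the `provider_id` field are invented for illustration, since the actual Medicare files use different names and layouts.

```python
import csv
import io

# Toy stand-ins: the problematic source file, and the second file
# that identifies which entries are the authoritative ones.
source_csv = """provider_id,name
A1,Broughton Hospital
A2,Broughton Hospital
B1,Mercy Hospital
"""
unique_csv = """provider_id
A2
B1
"""

source = list(csv.DictReader(io.StringIO(source_csv)))
unique_ids = {row["provider_id"] for row in csv.DictReader(io.StringIO(unique_csv))}

# Keep only the rows that the second file flags as the real entries.
deduped = [row for row in source if row["provider_id"] in unique_ids]
print([row["name"] for row in deduped])  # ['Broughton Hospital', 'Mercy Hospital']
```

The nice property of this approach is that we no longer have to guess which profile is correct - the second file makes that call for us.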
A Wall Street Journal article reported on the Department of Health and Human Services' requirement that health insurance companies report details of each policy that they underwrite. A key element of this requirement is that insurance companies must report the details via a standardized form, which allows for more meaningful side-by-side comparisons. At last, increased transparency coming soon to the health insurance market (supposedly starting March 2011).
Having bought health insurance on the individual market, I look forward to the increased access to this information. Who doesn't like the idea? Not surprisingly, insurers were said to be "concerned about the potential cost and administrative burden of the new requirement." That has a familiar ring to it -- didn't mobile phone operators say the same thing when regulators required phone number portability?
With increased access to information like this and other tools in adjacent spaces, hopefully patients will be able to make better decisions about their health. That's what we're working towards.
In a prior blog post, I referenced the balance between privacy and transparency in our quest to empower patients to make better decisions. The discussion around patient reviews seemed sufficiently complicated that I decided to address that separately. So, picking back up on the topic, the question is whether we would like to allow providers to hide certain reviews.
For us, there would actually not be much of an issue if we knew that all reviews were true (and objective). If that were the case, the answer would essentially be the same as the answer to the question of whether or not to display something like disciplinary actions: yes, we can understand why providers might want that hidden, but no, we wouldn't want to hide it. The question becomes more complicated in light of allegations that people write false information in their reviews (whether that be one provider writing a false review for a competitor, or an unhappy patient making up facts to strengthen his case).
It'd be interesting to read a study on the estimated number of patently false reviews (although I'm sure that such a study would be expensive and difficult to pull together). Overall, patients very much look to see what other patients say -- for a while, that was our most requested feature. The sense that we got in talking with people is that patient reviews need to be taken with a grain of salt, much like reviews for other products and services. While it might be easy to fake one or two reviews, it seems unlikely that someone would go through the trouble of creating ten different reviews for the same provider. Impossible? Nope. Improbable? We think so. And that brings us to our take on whether or not to show patient reviews: a few reviews don't offer much signal, but the aggregation of many reviews voicing the same feedback over a long period of time is likely to mean something. Interestingly, someone just wrote in with the same opinion.
And for those who are curious, we try to offer something extra when people leave a review on our site. After typing in their feedback, we give users the opportunity to upload some documentation (e.g. an explanation of benefits form or a receipt) and request a verification. If everything checks out, we'll mark that review as a "Documented Encounter" -- it's a little like an online retailer saying that a review is from a "Verified Buyer" since they know the online identity of the individual as well as his purchase history. Have thoughts on this topic? Let us know.
Product development never seems to go quite as fast as we would like, and we've ended up with an impressive stack of front-end web development tasks still to be done. Thus, that's the next area that we would like to hire for. Know anyone who might be interested? Here's the scoop.
In case you're wondering, we value people's enthusiasm and their ability to learn far more than we value prior experience. For this role, we're happy to train the right person. Are you someone who yearns to make a difference in the world of health care? Let us know!