I have been recommending this system to people for over a year now, ever since I first discovered it. It is probably single-handedly one of the best startups to invest in.
Just months before hearing of it, my pregnant wife and I were researching how hard ultrasound devices are to build, and were very surprised at how ultrasound can be used to detect/diagnose an insane number of common illnesses.
The hardware and physics behind ultrasound aren't nearly as technical or as advanced as you may think; nowadays it is mostly a software problem. Combine the imaging with personal baselining and machine learning across similar body types fitted to 3D human-model templates, and you'd get Star Trek's medical scanner.
The technology here could be used to eliminate 99%+ of the waste and inefficiency in the medical industry, effectively bankrupting the current institutions and disrupting the industry with better, cheaper, healthier, less-annoying care. This is a future we should be fighting for.
Very excited for this. I try to use bedside US daily in my hospitalist work, but the portable US in my hospital is on a heavy cart, and having to take the elevator instead of the stairs makes it not worth it.
Also, the article makes the common typo of writing HIPPA instead of HIPAA. Otherwise, great article.
Tangential but interesting: conventional spell checkers, in my experience, tend to turn off for all-caps words, or words that start with a capital. Turning the check on for those words all the time can be annoying, so you need something smart that detects common errors but doesn't hose you with false positives.
Tangential to this tangent: Google Docs spell check catches wrong use of "flower" vs "flour" based on context. Try this: open a new doc, type:
They probably use a bloom filter, manually populated with a bunch of 'probably spurious' word pairs. The combination of 'flower' and 'eggs' will evaluate as 'probably spurious', but the combination of 'flower' and 'sugar' evaluates as 'not spurious'.
For reference, a bloom filter is an extremely space-efficient, probabilistic data structure that acts a bit like a set and can answer the query 'does the bloom filter contain this entry?'. The filter will respond with either 'definitely not' or 'possibly/probably', with the false-positive rate depending on how it is tuned.
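To make that concrete, here's a minimal bloom filter sketch in Python. The sizes, hash construction, and the `word|word` pair encoding are all illustrative choices of mine, not anything the spell checker is known to use:

```python
import hashlib

class BloomFilter:
    """Minimal bloom filter: k bit positions per item over an m-bit array."""

    def __init__(self, m_bits=1 << 20, k_hashes=7):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item):
        # Derive k positions from one SHA-256 digest (simple, not optimal).
        digest = hashlib.sha256(item.encode("utf-8")).digest()
        for i in range(self.k):
            yield int.from_bytes(digest[4 * i : 4 * i + 4], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        # False means "definitely not present"; True means "possibly present".
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Populate with normalized spurious pairs, then query at spell-check time.
spurious = BloomFilter()
spurious.add("flower|eggs")

print("flower|eggs" in spurious)   # True -> possibly spurious, flag it
print("flower|sugar" in spurious)  # False (overwhelmingly likely) -> not flagged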
You could conceivably populate this (still hardcoded) bloom filter automatically by doing a brute-force language-corpus search for heavily correlated word pairs where one or both words have phonetically similar misspellings. E.g. 'sea' and 'breeze' would be heavily correlated, and 'sea' has a phonetically identical misspelling, 'see'. You could then automatically add 'see' + 'breeze' as a spurious pairing to the filter, as sketched below.
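Building on the `BloomFilter` sketch above, the population pass might look something like this; the `HOMOPHONES` map and `populate_spurious` helper are hypothetical names I made up, and a real system would derive the phonetic matches from something like a pronunciation dictionary rather than a hardcoded table:

```python
# Toy phonetic map: correctly spelled word -> phonetically identical misspellings.
HOMOPHONES = {"sea": ["see"], "flour": ["flower"]}

def populate_spurious(bloom, correlated_pairs):
    """For each heavily correlated pair, add misspelled variants to the filter."""
    for a, b in correlated_pairs:
        for wrong in HOMOPHONES.get(a, []):
            bloom.add(f"{wrong}|{b}")
        for wrong in HOMOPHONES.get(b, []):
            bloom.add(f"{a}|{wrong}")

# 'sea' + 'breeze' is heavily correlated in the corpus; 'see' sounds identical,
# so 'see|breeze' gets added as a spurious pairing.
populate_spurious(spurious, [("sea", "breeze"), ("flour", "eggs")])
print("see|breeze" in spurious)  # True -> flag "see breeze" as suspect
```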
I think Google has pretty good deep learning-based word prediction for their Swype-style Android keyboard.
There was a period, maybe 1.5 years ago, during which text-input prediction got substantially worse, then gradually improved. Along with the change came the ability for the keyboard to revise the estimated word after you entered the next one, using the combination of your entries for both words to estimate them simultaneously.
If they have language models that perform that task at a level worth pushing out to consumers, they can do some smoothing of entries in a list.
I think you are overestimating how big a difference a bedside US will make in my hands. If I think I'm going to change management on a pt based on US, I'm ordering it done right, by the right person. The usefulness of bedside US is still in its infancy; most of the time I'm using it to learn or to confirm what I already know. Eventually I hope we get to the point where it's a standard of care that makes a large difference for pts.
This comment breaks the site guidelines, which ask you to assume good faith and please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize.
The one-time purchase cost of a Lumify is closer to $8,000 USD, or there's a $199/mo subscription (which leaves you never owning the transducer).
At $2,000, this is a lot cheaper, which will hopefully drive wider deployment.
Even in more developed countries, wider-scale use of imaging would be amazing: if primary care could use ultrasound directly, without referral to a specialist, and pair it with smart image analysis, diagnosis times could be cut significantly. In my case I waited 3 weeks for an ultrasound appointment on the NHS, and that could have been avoided entirely.
Well, point-of-care ultrasound (which is what the Butterfly is for) can often still lead to formal ultrasonography to be read by a radiologist. However, it can decrease time to initial treatment, and depending on the skill of the user it can sometimes prevent the need for the formal ultrasound, though not always.
They are also lowering the bar for knowledge required to successfully operate the probe:
"Computer vision algorithms ingest footage from the handset’s camera and detect the probe’s location in real time, directing users through augmented reality (AR) prompts precisely where to position it. (Butterfly calls it “Tele-Guidance”.)"
What can it detect, and what can't it, compared to MRI, CT, and X-ray?
I know it's used for pregnancy, but I've never seen other contexts for US so far, so I would of course love to know whether this will make more applications more easily available!
> The cloud storage service to which images are uploaded is AES 256-bit encrypted and SOC II certified
Um... encrypting with AES-256 is easy. But will the keys be properly handled?
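To illustrate just how easy the encryption step itself is, here's a minimal sketch using Python's `cryptography` package; the payload is made up, and everything hard about real deployments (key storage, rotation, access control) sits outside these few lines:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The "easy" part: AES-256-GCM in a handful of lines.
key = AESGCM.generate_key(bit_length=256)  # everything around this key's
                                           # lifecycle is the actual hard part
aesgcm = AESGCM(key)
nonce = os.urandom(12)  # 96-bit nonce; must never repeat for the same key
ciphertext = aesgcm.encrypt(nonce, b"ultrasound image bytes", None)

# Decryption raises InvalidTag if the ciphertext was tampered with.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
```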
It's useful in a ton of situations; here are just a few examples:
In a cardiac arrest patient, it's useful to know if the heart is actually moving or not (i.e. you may be seeing electrical signals on an EKG, but the heart isn't physically responding). It's also useful to look for a buildup of fluid around the heart (a pericardial effusion), which can squeeze the heart and prevent it from working effectively (cardiac tamponade).
In trauma patients, a "focused assessment with sonography for trauma" or "FAST scan" is a quick way to check the common sources of major internal bleeding (which can provide useful guidance for treatment decisions in a very time sensitive setting).
It's also useful for "routine" stuff like the placement of IVs. Ultrasound can be used to positively identify and locate veins that are otherwise difficult to feel through the skin.