I am confused; the article seems to be short on details. Was the attack an open S3 bucket? The company in question seems to be hiring for GCP, so I imagine they don’t use S3 at all.
Did the submitter intentionally change the post title to get more clicks?
Multi-cloud isn't uncommon, especially interacting with vendors. It has been a long time since I've worked somewhere that didn't have at least some usage in more than one cloud provider.
It is really unfair, the way capitalism treats nurses (and police officers, school teachers). Without them, the entire system wouldn't even exist. Capitalism may sound like a great idea at first, but in the end you have a few rich bastards milking the rest.
It has nothing to do with capitalism. You can have capitalism and a society that is aware that having good public services pays off, as overall spending on schools, security, etc. will be smaller if done at the whole-country level.
In the USA that is not easy to achieve, as, historically, it is not a single country but a union of "states", that is, countries, so the main boss is not supposed to interfere too much with local bosses or force particular "federal" laws on them.
That doesn't answer whether it is fair. Capitalism will always push for smaller government, with all the power it has at its disposal. At the same time, why can capitalists have their rich-making business schemes while nurses and other (semi-)public servants are stuck with whatever society decides is good for them? The system is rigged in favor of capitalists.
Close it. Sell off all the assets and give the proceeds as compensation to those whose data was exposed. Why do we have a human death penalty but not a corporate one?
Yeah, I remember when Amazon's AWS was new and people said "hey, it's cool but not secure." Then AWS added all these security features but added a caveat: BTW, security is your responsibility.
Here we are. I guess we can blame the users and not any shitty security architecture slapped on AWS.
Clearly what matters most is that legal culpability be avoided, not that users will be secure. The former is 'shite security' while the latter is good security
The only mistake AWS made was making buckets originally public by default. It’s been many years since that’s been the case. At this point, you have to be completely ignorant to be storing PII in a public bucket.
It's literally, and I do mean this literally, 1 click to block all public traffic to an S3 bucket. It can be enabled at the account level, and is on _by default_ for any new bucket. What exactly more do you want?
> It's literally, and I do mean this literally, 1 click to block all public traffic to an S3 bucket.
I'm reasonably certain that for quite a while blocking all public access has been the default, and it is multiple clicks through scary warnings (through the console; CLI or IaC are simpler) to enable public access.
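For anyone curious what that setting looks like outside the console, here is a minimal sketch with boto3 (the bucket name and account ID are made up); the same block-public-access configuration exists per bucket and, via the s3control API, account-wide:

    import boto3

    BLOCK_ALL = {
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    }

    # Bucket-level: block all public access on one bucket.
    boto3.client("s3").put_public_access_block(
        Bucket="example-nurse-docs",  # hypothetical bucket name
        PublicAccessBlockConfiguration=BLOCK_ALL,
    )

    # Account-level: the "enabled at the account level" option mentioned above.
    boto3.client("s3control").put_public_access_block(
        AccountId="123456789012",  # hypothetical account ID
        PublicAccessBlockConfiguration=BLOCK_ALL,
    )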
No, that is not the sales pitch to enterprise customers. They are pitching that sysadmins are stupid and that security nowadays is too complicated, hence cloud is the only safe solution.
Yet every month I see a story here about a huge data leak from an unrestricted bucket.
I'll need to dig up a source but I recently heard about this company and, apparently, before offering gigs they do a credit report to determine how much debt the person is carrying (i.e. how desperate they are) and they use that information to _round down_ the hourly rate they offer them.
In the unlikely event that there are any negative consequences for this breach, they deserve every bit of them and more.
I don't remember the source, but I believe I listened to a podcast on an "uber for nurses" (not sure if it was this place), but they do all sorts of nasty things that really shaft the nurses. ISTR that when the nurses get called in, they have to be running a phone app that tracks them, and if they get stuck in traffic or lose cell signal, they get demerits. They pretty much do anything they can to give the nurses a demerit, and demerits cause your pay to go down.
So they're pretty much taking the existing terrible nursing environment in healthcare, and weaponizing it. Nurses already have too many patients and not enough CNAs, on top of 12 hour shifts, needing to do charting after those 12 hours. Healthcare squeezes nurses to the breaking point. Data point: my wife is a nurse.
Isn't this exactly what you'd expect from an Uber for (something)?
Garbage company, garbage culture, garbage business model.
Well yes, but more so it's how I expect a shitty, perversely structured industry that makes boatloads of money by perpetuating a variety of huge barriers to entry to treat the employees who have the fewest barriers to entry protecting them.
I think I heard the same podcast - not only do the apps try to discover the minimum rate a nurse might take, they’ll actively attempt to manipulate the circumstances of nurses who were in a strong position so they too end up more dependent and exploitable.
This is the presentation that discusses this wage suppression for nurses.
https://pluralistic.net/2025/02/26/ursula-franklin/
Thanks. This is definitely the source I was referring to.
However, as it applies to my parent comment, the companies mentioned were: Shiftkey, Shiftmed and Carerev. I do not see ESHYFT mentioned, so I stand corrected.
This is abhorrent if true; truly evil behavior.
What's interesting is that, broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong. Take the stock market, for example: insider trading is illegal, and you don't often hear calls to reverse these laws.
But when it comes to private markets and semi-private negotiations that same sentiment doesn't easily transfer. Does society benefit in some unique way from allowing asymmetries in labor negotiations, private markets like Uber, or B2C relations like Robinhood (1,2)?
1. https://www.sec.gov/newsroom/press-releases/2020-321
2. Note, Robinhood was fined not for front-running customers, just for falsely claiming customers received quality orders. I suspect they've only stopped the latter behavior.
> broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong
I don't think that's true at all. Companies and individuals negotiate all the time with information the other party doesn't have. Insider trading is about fairness on public markets so every negotiating party of the same type has the same information, and is quite specific to that.
Incentive-wise, you're probably a lot better off if your own broker is front-running you than if an HFT desk at a liquidity-provider firm is doing it, since the broker is at least in a position to kick some of that back to you in the form of reduced fees or whatever.
It's definitely shady, but it's par for the course. Uber charges you more if you have more gift cards loaded, or just spend more on average in general. You charge what the market will bear.
You charge what the market will bear, not the individual.
No, it just hasn’t been possible to differentiate as well before.
One example is biscuit manufacturing, where it’s a fairly open secret that supermarket own-brand biscuits are the same product as name-brand, because it’s better to capture that segment at a lower margin than to lose it to competition.
Tech now makes it possible to target individuals rather than demographics, but there’s nothing inherently against the status quo in doing so.
There's no such thing as "the market"; there are market segments that abstractly represent groups of people with similar characteristics. Charging different prices to people in different segments is standard business practice. Burger chains could charge wealthy individuals $100k per burger if they wanted to; it's just that burger chains usually have difficulty distinguishing the truly wealthy individuals who walk in the door who would have no trouble putting down that kind of money for a burger.
.... which, in the day and age of facial recognition, gives me an idea for a startup.
Burger chains have at least gotten a start on differentiating their pricing - by raising prices dramatically across the board, and telling anyone who’s frugal or just broke that they can only get discounts (to bring prices slightly lower than today’s pricing, but still a lot more than before) if they use the app. Upper-class people don’t bother with it and pay full price, frugal people take the time to figure out the cheapest way to use one of the current “offers” to assemble a meal.
The market is an agglomeration of many individuals, meaning that there is no hard and fast rule that you must charge only one price for the entire market; indeed, many custom-priced products exist, enterprise SaaS being one example.
The market (mostly) ensures there is another individual.
Aren’t they just creating a market of 1?
"just" is doing a lot of heavy lifting here
Pieces of shit. And then they assign you a score for each travel, as if you are really "carpooling" when in reality it is a shitty taxi replacement (not that taxis are on a moral high ground, but the point still stands).
Game theory transcends basic humanity.
You might be surprised to learn that they're not the only company to do so.
Names. We need names.
Not of companies. Of the people who choose to work for them (or, rather, choose not to stop working for them after they build these "features").
We don't need names, we need legislature, and we need to vote for people who will write it, as opposed to grifters who only seek to pad the pockets of billionaires.
These predators aren't scared of name and shame. Any publicity is good publicity (and if it actually gets bad, they'll sue the pants off you). They are scared shitless of laws censuring their behavior. It's why they fight like mad to ensure that they aren't subject to them.
> These predators aren't scared of name and shame.
There are exceptions. See the ongoing kerfuffle over "DOGE" employee lists.
It's amazing that, on a cursory look, only 11 states make this practice illegal. The "AI scriptown" is growing.
Proper data privacy laws would make this sort of thing nearly impossible
I’m interested, given the massive nursing shortages, why any nurses were using this service at all. Especially at higher levels, there’s no reason to mess with a shitty app that underpays you, when you should be able to walk into any provider’s office or facility and get hired almost immediately (and for RNs, you even have wide-ranging telehealth options).
This was my thought exactly. There is a giant nursing shortage. I know some nurses who are traveling nurses and they make bank, and they don't need any BS app. (Just want to emphasize, nursing is an incredibly difficult job at the moment, but there are also currently weird dynamics where traveling nurses can actually make a lot more than "stationary" nurses).
Thus, I'm led to believe that nurses using this app have to have some sort of difficulty finding jobs for other reasons, or they're just not informed about their options.
I imagine many of them are people who can't commit to full or even part-time jobs because of responsibilities like childcare or eldercare; their own physical or mental health issues; etc.
That seems like a terrible way to estimate nurse wages.
People have spouses.
People’s parents pay credit cards.
People with bad credit sometimes don’t care.
People have family money.
People with low debt can be desperate for work.
Does it even work?
At scale, the corner cases don't really matter. In aggregate, if it's decently well correlated and readily available, it's probably going to be used.
I can't find it now, but I believe LexisNexis or another large similar reporting/data agency had a product catalog of dozens of products that spit out values for ability to pay, disposable income monthly, annual income, etc.
It makes you feel awful thinking about the direction things are headed: corporations approaching omniscience regarding all facts of our lives that are reasonably of value to them.
But I’d argue they aren’t corner cases.
Most people I know with bad credit aren’t desperate for money. At least not educated, highly paid ones like nurses.
Most just ignore their financial problems in the hope they go away.
Not to mention nurse demand outstrips supply, so they have options and can certainly turn down bad offers.
In the section of their Privacy Policy titled Data Security [0]:
> We use certain physical, managerial, and technical safeguards that are designed to improve the integrity and security of information that we collect and maintain. Please be aware that no security measures are perfect or impenetrable. We cannot and do not guarantee that information about you will not be accessed, viewed, disclosed, altered, or destroyed by breach of any of our physical, technical, or managerial safeguards. In particular, the Service is NOT designed to store or secure information that could be deemed to be Protected Health Information as defined by the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”).
IANAL and all that, but I’m not sure you can use the excuse “We didn’t design our system to be HIPAA compliant, sorry,” and hope your liability disappears. Does anyone know?
0: https://eshyft.com/wp-content/uploads/2019/06/ESHYFT-Privacy...
HIPAA applies to patient data, not provider data.
> I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
It looks like providers accidentally uploaded some PHI.
IANAL so may be wrong, but I worked for a healthcare company. Whether HIPAA applies to them depends on if they are considered a covered entity or a business associate [0].
IMO they aren't bound to HIPAA requirements as a covered entity.
Business associate is a little tricky to determine. But business associates have to sign a BAA (Business Associate Agreement). And I doubt they would have signed one if they have that in their privacy policy.
Also, just as a side note, HIPAA is not an ideal standard to begin with for security. Many large companies exchange bulk PHI via Gmail since it is HIPAA compliant.
0: https://www.hhs.gov/hipaa/for-professionals/covered-entities...
> Also, just as a side note, HIPAA is not an ideal standard to begin with for security. Many large companies exchange bulk PHI via Gmail since it is HIPAA compliant.
You seem to imply using GMail is a bad thing? I think GMail, when appropriately configured to handle PHI, is probably a million times more secure than some crappy bespoke "enterprise" app.
It isn't that hard to set up a secure SFTP server to automate the exchange. But then again, this is a post about configuring an S3 bucket with public access for SSNs.
The issue with Gmail is sending to the wrong email address, sending to a broad email list, and having people download files to their local machines. And the amount of PHI being transmitted in these files is larger than this S3 bucket.
>It isn't that hard to set up a secure SFTP server to automate the exchange
When you've got a trickle of information coming and going from hundreds or thousands of other individuals working at tens or hundreds of other entities, it is.
You'd eventually wind up developing the kind of ridiculous "secure messaging and file drop" type service that every megabank builds on top of their SFTP and ticketing systems for that purpose. That stuff ain't cheap to run and keep running.
Better to just start with a solution that's 99% there.
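For the simple two-party exchange described above, a sketch of what that automation might look like in Python with paramiko (the host, credentials, and paths are all hypothetical, and the file is assumed to be encrypted before upload):

    import paramiko

    # Connect to the partner's SFTP endpoint (host and key path are hypothetical).
    transport = paramiko.Transport(("sftp.partner.example.com", 22))
    transport.connect(
        username="phi-exchange",
        pkey=paramiko.RSAKey.from_private_key_file("/etc/keys/exchange_rsa"),
    )
    sftp = paramiko.SFTPClient.from_transport(transport)

    # Drop the already-encrypted batch file into the agreed-upon inbound folder.
    sftp.put("batch-2025-03-10.csv.gpg", "/inbound/batch-2025-03-10.csv.gpg")

    sftp.close()
    transport.close()

The thread's objection still stands, though: this is trivial for one counterparty and miserable for a thousand.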
HIPAA only applies to a very specific entity called a "covered entity". At a high level, "covered entities" are health care providers that accept insurance, or insurers. That's right, there's a massive caveat on "accepts insurance". You can be a healthcare provider and not have to comply with HIPAA if you don't accept insurance.
That being said, HIPAA isn't even relevant here because "ESHYFT" is just a provider of labor. No different from a big consultant providing staff augmentation services.
> At a high level, "covered entities" are health care providers that accept insurance, or insurers. That's right, there's a massive caveat on "accepts insurance". You can be a healthcare provider and not have to comply with HIPAA if you don't accept insurance.
Again, HIPAA continues to be the most colloquially misunderstood law out there.
The rule that makes providers "covered entities" isn't really about insurance, it's about whether they transmit specific HIPAA "transactions" electronically. Now, yes, most of these transactions having to do with providers are things like claim submissions or pre-authorizations to insurance. But there are other reasons a provider may need/want to send a HIPAA transaction electronically.
My point is that there isn't some sort of "loophole" where providers that don't accept insurance are somehow being sneaky. The whole point of the HIPAA security rule is to protect PHI when it is transferred around to different entities in the healthcare system. If the information is going just between you and your doctor, HIPAA isn't relevant, and that is by design.
> it's about whether they transmit specific HIPAA "transactions" electronically.
That's correct, but if you don't accept insurance then you will not transmit anything that meets the criteria to be covered by HIPAA. At least, in terms of being a provider. Things are different if you're a health plan or clearing house.
I spent a lot of time and money questioning this with lawyers at a health tech startup I previously worked at. The underlying reality is nearly the entire US healthcare system falls under HIPAA because nearly everyone wants to accept insurance. However, if you're a doctor running a cash-only business you will not be a covered entity, even if you send PHI electronically.
HIPAA doesn't care about your POS TOS. It either applies or does not.
That said, it's both less broad and more toothless than I'd like. If FB convinces you to install a tracking pixel (like button) stealing your private medical data, they likely haven't violated any laws. At most you'd be able to file a claim against the person who created the leak.
Not a lawyer and all that, but for TFA I don't think HIPAA would be a valid way to try to limit your losses. It's a bit closer to what would happen if you (a doctor) uploaded patient data to Google Drive and then somehow leaked that information (one of Google's contractors disclosing it, a hack, whatever). Nothing about ESHYFT's offerings requires or would be benefited by the data HIPAA protects, and (ignoring incompetence and other factors) I'd be as surprised to see my health data leaked there as I would to see a YT video going over my last lab reports because of some hospital's actions.
They could still be liable for all sorts of other damages (and maybe somebody can convince a court of a HIPAA violation), but it's not an easy HIPAA win.
If you're not a direct health provider, you probably can. Don't take that as an endorsement.
If you partner with a healthcare provider to provide any sort of technical services, you will be required to sign a BAA (Business Associates Agreement), which makes you similarly liable under the HIPAA & HITECH acts.
It depends; there are some exceptions. [0]
>With persons or organizations (e.g., janitorial service or electrician) whose functions or services do not involve the use or disclosure of protected health information, and where any access to protected health information by such persons would be incidental, if at all.
Based on the context from the article of the PHI uploaded being incidental, it would probably fall under this exception. It sounds like ESHYFT isn't meant to be storing any PHI based on the privacy policy above.
0:https://www.hhs.gov/hipaa/for-professionals/privacy/guidance...
[Nevermind]
The PII of the nurses being accidentally shared by a staffing agency isn't a HIPAA violation. Yes the nurses are providers but their relationship with the Uber for nurses service isn't a medical provider relationship. It's definitely a legal and ethical failing but I don't think it's a HIPAA one.
This is what I took away from the reading. It's basically a shift/employee management platform. The only reason we're even discussing HIPAA is because it's health care industry adjacent.
If you replaced nurses with gig workers and "uber for nurses" with something like WeWork, this would just be like every other leak we talk about on HN.
HIPAA avoidance is much narrower than that. Entities which perform administrative or managerial duties on behalf of a mandated organization that have to transmit PII to provide that service are also covered, even if the entity itself isn't a provider.
If 'Uber for nurses' is acting on behalf of nurses, it probably doesn't apply? If it's acting on behalf of the hospitals (who are indisputably covered entities), then the situation is much less clear.
I encountered a similar situation with my startup many years ago and decided "better safe than sorry" after consulting the lawyer.
I used to work in the field. HIPAA protects patient data, not provider data. If my understanding is correct that only nurse PII was leaked, this has nothing to do with HIPAA.
In general, I've found that people tend to think HIPAA applies much, much more than it actually does. Like people thinking if you're in a meeting at work with clients and say "Sorry, Bob couldn't be here today, he's got the flu" that that's a HIPAA violation. No, it's not.
This is just an employee data leak, just like a bajillion other employee data leaks. The fact that the employees happen to be nurses still doesn't mean it has anything to do with HIPAA.
ESHYFT isn't a covered entity, so HIPAA doesn't apply to them. Even if they have health data of their employees in their system, they're still not a covered entity.
Really, "Uber for Nurses" is a title to drum up interest. "Large Staffing Service" would be factually accurate.
This 100%. This needs to be a top level comment.
Ah, doing more than skimming the article
>I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
The title is exaggerating what the article says, and the article is making a big stretch about this possibly being HIPAA covered. I stand corrected: this has nothing to do with HIPAA.
What was leaked were nurses' doctor's notes, submitted to justify calling out of work. Still a serious leak, but nowhere near what is being suggested.
I'm confused because the article lays it out by the 4th paragraph, and you have the right understanding, up until "we're a startup"
Maybe you think the startup maintains patient records?
The article lays out that the nurses, the providers, uploaded them. This is a temp booking system. The health records were uploaded by the nurses to communicate reasons for absences to their employer and weren't required or requested.
They have as much responsibility as Dropbox does. Nurses shouldn't have uploaded them.
Worth mentioning, because the authority level of medical practitioners throws people off: don't ever give a doctor or practice your Social Security Number. They don't need it. Similarly, if they want to check an ID, that doesn't mean scan or photograph it. Doctors, practices, etc. are the worst at infosec. They have no training and basically no penalties if they do something wrong, and all of that info is only to follow up in case you don't pay your bill.
In the US, HIPAA is pretty much the strongest privacy legislation there is. There's probably no group that would have a more severe penalty for leaking your info than your healthcare provider.
HIPAA has strict rules with severe penalties, but enforcement is at best spotty. So honest hospitals and doctors' offices bend over backwards to comply with the rules at great expense, but bad actors are rarely punished. It's the worst of both worlds. I'm pretty sure that is why the punishments are so harsh: they need to put the fear of god into practitioners to make them take it seriously, since there are so few inspectors.
It's the difference in medical establishment skill level between your doctor and you. You are always at a disadvantage. I've long thought that a disinterested third party needs to be involved. Someone with real oversight taking a position adversarial to the hospital and strictly to create the best possible outcome for the patient.
The Hippocratic model isn't awesome.
In 2025 an oath don't mean shit.
Perhaps true, but the strongest privacy protections in the US are still pretty weak. The biggest penalty I know of is Anthem 2018, where they leaked HIPAA-qualifying records on 80 million customers. Their financial penalty was a whopping... $16 million. Two dimes per affected customer!
It's true that the US rarely penalizes corporations enough to really disincentivize things, but healthcare providers probably take client data security more seriously than just about any other group besides maybe law firms. It's weird to single them out as being particularly unconcerned with and unpenalized for leaks.
We saw ours input PII into a Windows box. The idea that their ActiveX monstrosity has any security is not very persuasive.
HIPAA was designed for portability -- the 'P' stands for portability, not privacy -- of health info, so there are immense carve-outs in service of that objective. Fines for violating HIPAA are almost non-existent.
HIPAA is wildly misunderstood by the public as a strong safeguard, meanwhile medical offices just get any patient (a captive audience) to sign a release waiver as part of patient intake ...
PCI-DSS is the strongest, HIPAA is just a rubber stamp
That's not actually law at all. It's part of the contract with payment processors.
How many healthcare providers do you know personally who have faced severe penalties for leaking information?
The reality is that for a small doctor/dental/whatever office, there is essentially 0 risk. HIPAA violations that carry significant penalties go to huge hospitals and healthcare companies.
Your neighborhood doctor has to screw up in a major way for an extended period of time to have a minute risk of any consequence.
How much information do you think your neighborhood PCP is “leaking” compared to, say, Elevance? This is such a goofy take. Are you expecting that every small provider group is just firing your data off on Facebook every Tuesday, and somehow, no one cares? They’re all using certified EMRs. They all take security seriously because their licenses are literally on the line. Do you work in healthcare?
If they provably expose your data, and you report them, they will get fined. Or they would have last year, who knows if those people still have jobs.
Only the young and inexperienced believe the law is enforced when it matters.
And yet the data still seems to leak pretty frequently...
Eh.
Last year the total HIPAA violation fines were less than $9.2 million.
A figure I could find for hospital revenue in the same year, which is a good enough proxy for fines vs. revenue, is about $1.2 trillion.
Which, rounding because who cares, comes to about 0.001% of medical revenue being paid out in HIPAA violation fines.
Or the equivalent ratio of about a cup of coffee for a typical enough person per year.
HIPAA needs teeth; what it says you're supposed to do is quite strong, but the enforcement of it is pathetic.
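For concreteness, that back-of-the-envelope ratio checks out (both figures as quoted above):

    fines = 9.2e6      # total HIPAA violation fines last year, per the comment
    revenue = 1.2e12   # approximate annual US hospital revenue, per the comment

    print(f"{fines / revenue:.4%}")  # 0.0008% -- roughly the 0.001% claimed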
What do you do if they refuse to book an appointment without it?
I've never had that happen (sample size ~5). They accept non-citizen patients, so they probably don't make SSN a required field.
(for SSN, never tried to prevent scanning of my ID)
Find a new provider. I have gone 2 decades without providing my SSN to doctors.
New provider is unrealistic for many in USA. In NYC, maybe easy; in rural WI/KS much less so.
You can just use my SSN: 123-45-6789.
I wonder how old the S3 bucket was, because at some point AWS made new S3 buckets private by default.
Which means it's either old, or they recklessly opened it up because they couldn't get files uploaded/downloaded to the bucket from their mobile app/services.
Also possible a webdev opened it up so they could use the assets on a website, and didn't think about other private data in the bucket.
Are y'all gonna blame AWS like you blamed Firebase last week ?
The security procedures I take while hacking out something for my friends at 3am should not extend to products hosting PII. It's up to YOU to implement basic data security.
> It's up to YOU to implement basic data security.
You definitely need to do this, but a platform should help where possible, and try to have users fall into a 'pit of success' where if a dev just goes with the defaults everything is fine. In this case, S3 buckets should be private and encrypted by default and devs should need to actively choose to switch those things off (which I think may be the case now, but it wasn't in the past.)
> S3 buckets should be private and encrypted by default and devs should need to actively choose to switch those things off
Yeah, that's the case right now. There are multiple screens you have to go through that almost scream at you that you're making EVERYTHING PUBLIC. Also, in the overview, it distinctly says "!! PUBLIC".
This is like having a small store and instead of locking up at the end of the day, blaming the door for not automatically locking. Yes new automatic locks exist now, but you still need to check.
Cloud technology allows us to build fantastic software very fast. But if you’re too lazy to implement a basic api to get S3 data on a needs to know basis, that’s on you.
AWS makes this very easy. You can’t blame anyone else.
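As a sketch of that needs-to-know API (the bucket name, key layout, and authorization rule here are all made up for illustration): keep the bucket private and have the backend hand out short-lived presigned URLs only after checking who is asking:

    import boto3

    s3 = boto3.client("s3")

    def get_document_url(requesting_user: str, key: str) -> str:
        """Return a short-lived download link, but only for the caller's own file."""
        # Hypothetical authorization rule: users may only fetch their own documents.
        if not key.startswith(f"nurses/{requesting_user}/"):
            raise PermissionError("not your document")

        # The bucket itself stays private; the link expires after 5 minutes.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "example-nurse-docs", "Key": key},
            ExpiresIn=300,
        )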
Why "Uber for nurses" and not the actual company name in the title?
According to the article the name is ESHYFT. It sounds like a brand of electronics found on AliExpress, but with less quality!
Please invest in my startup, ENSHITIFY
Which one?!
It lets me know the company is bullshit in a way the company name never would.
Why does this keep happening? It seems like every month there's a new leak from an open S3 bucket?
New companies with immature systems, old companies hiring young developers doing side stuff off in their own world, bad default configurations, etc.
Most importantly, there are a large number of highly incentivized people probing constantly at mass scale. These days it's very easy to scan the internet (GitHub, IPs, domains, etc.) for information, and "bad S3 configuration" detection is just a script anyone can use. No advanced programming skills required.
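To illustrate how low that bar is, a minimal sketch of the kind of check those scripts run (bucket name hypothetical): make an unsigned, anonymous request and see whether the bucket answers:

    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config
    from botocore.exceptions import ClientError

    # An anonymous client: no credentials are attached to the request at all.
    anon = boto3.client("s3", config=Config(signature_version=UNSIGNED))

    def is_publicly_listable(bucket: str) -> bool:
        """True if an unauthenticated caller can list the bucket's contents."""
        try:
            anon.list_objects_v2(Bucket=bucket, MaxKeys=1)
            return True
        except ClientError:  # AccessDenied, NoSuchBucket, etc.
            return False

    print(is_publicly_listable("example-nurse-docs"))  # hypothetical bucket name

Point that at a list of candidate bucket names and you have the "script anyone can use".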
Are we pretending that there are still functional regulatory agencies that are able to take action over this?
Sorry for the dude that built their infra and was really tired and then woke up to this, what a bummer.
Move fast and violate HIPAA.
Does HIPAA apply to HR info, or just patient health data?
HR likely deals with health info related to disability or FMLA claims, or work-related injuries, that is shared with health care providers and/or insurance companies; this makes them a covered entity subject to the requirements under HIPAA.
Protected health information (PHI) under U.S. law is any information about health status, provision of health care, or payment for health care that is created or collected by a Covered Entity (or a Business Associate of a Covered Entity), and can be linked to a specific individual. This is interpreted rather broadly and includes any part of a patient's medical record or payment history.
Source: I run Wyndly (YC W21, https://www.wyndly.com), which is most easily understood as an online telehealth allergist.
Sure, that's the definition of PHI but is ESHYFT a HIPAA covered entity? If not then the definition of PHI isn't legally relevant (although they still have an ethical requirement to secure employee data, and might have violated other data protection laws).
https://www.hhs.gov/hipaa/for-professionals/covered-entities...
Yes, but you're missing a massive caveat that is conditional on the definition of "covered entity".
Covered Entity has a narrow meaning. Notably, if you don't accept insurance, it's very unlikely you're a covered entity.
It considers non-health-specific identifying info about patients that might be stored with the health-specific info to also be PHI.
The linked article does not mention Amazon S3 or AWS
Is there a different source for the "open S3 bucket" in HN title?
Would be surprised if this company makes it out of this. Medical records… yikes.
A company of this size definitely wouldn’t be able to tank a multimillion dollar lawsuit.
What a surprise. How do we, the common people, deal with corporations and governments leaking our information left and right? Even password storage services are not really safe, AFAIK.
Will there be any consequences for the company?
No. And hence this will keep happening.
Even if there are, it’ll be minuscule compared to what is necessary to drive effective change.
The fine for one person's information from this site should be equivalent to their entire revenue for the year; it should not be dischargeable through bankruptcy, and it should be required to transfer to any company purchasing their assets.
Their entire executive team should be jailed for a minimum of 3 years per individual offense.
Only then will there be any modicum of an opportunity for us to see some real change.
Your proposal is so bizarrely out of proportion with the harm caused that I can’t tell if it’s parody or not. Why not execute them while you’re at it?
That would be cruel and unusual — their families and friends would needlessly suffer. They'll need to be executed too.
I recommend we summarily execute people who don’t use the middle lane to go straight at an intersection, blocking everyone else from turning right on red; I feel as if jail time for these executives was pretty reasonable.
> Their entire executive team should be jailed for a minimum of 3 years per individual offense.
This is so over the top reactionary and stupid, I can't help but write off your entire comment.
You want the Chief Accounting Officer to go to jail for 3 decades because of a data breach?
I assumed there were more than 10 folks impacted by their poor business decisions; based on the number of images of SS cards, it should be life without parole. Preferably with hard labor in Siberia or western Nebraska.
Title: Thousands of Records, Including PII, Exposed Online in Healthcare Marketplace Connecting Facilities and Nurses Data Leak
(vs current: "'Uber for nurses' exposes 86K+ medical records, PII via open S3 bucket")
I am confused; the article seems to be short on details. Was the leak actually from an open S3 bucket? The company in question seems to be hiring for GCP, so I imagine they don't use S3 at all.
Did the submitter intentionally change the post title to get more clicks?
https://eshyft.com/careers/gcp-devops-engineer/
Multi-cloud isn't uncommon, especially interacting with vendors. It has been a long time since I've worked somewhere that didn't have at least some usage in more than one cloud provider.
Always. Always the open bucket.
It is really unfair, the way capitalism treats nurses (and police officers, school teachers). Without them, the entire system wouldn't even exist. Capitalism may sound like a great idea at first, but in the end you have a few rich bastards milking the rest.
It has nothing to do with capitalism. You can have capitalism and a society that is aware that good public services pay off, as overall spending on schools, security, etc. will be smaller if handled at the whole-country level.
In the USA that is not so easy to achieve since, historically, it is not a single country but a union of "states", that is, countries, so the main boss is not supposed to interfere too much with the local bosses and force particular "federal" laws on them.
That doesn't answer whether it is fair. Capitalism will always push for a smaller government, with all the power it has at its disposal. At the same time, why do capitalists get their rich-making business schemes while nurses and other (semi-)public servants are stuck with whatever society decides is good for them? The system is rigged in favor of capitalists.
What makes this uber for nurses?
It's Kelly Services for nurses, but Uber sounded cooler 10 years ago
Close it. Sell off all the assets and give the proceeds as compensation to those whose data was exposed. Why do we have a human death penalty but not a corporate one?
Annual reminder that the P in HIPAA stands for Portability, not Privacy.
Uber for ___ has lost all meaning
I wish private data was more independently audited.
Now if only this data found its way to some union organizer.
When do we start treating this kind of bullshit as criminal negligence, with prison sentences?
Yeah, I remember when Amazon's AWS was new and people said "hey, it's cool, but not secure." Then AWS added all these security features, but with a caveat: BTW, security is your responsibility.
Here we are. I guess we can blame the users and not any shitty security architecture slapped on AWS.
Clearly what matters most is that legal culpability be avoided, not that users be secure. The former is "shite security" while the latter is good security.
The only mistake AWS made was making buckets originally public by default. It’s been many years since that’s been the case. At this point, you have to be completely ignorant to be storing PII in a public bucket.
> shitty security architecture slapped on AWS
It's literally, and I do mean this literally, 1 click to block all public traffic to an S3 bucket. It can be enabled at the account level, and is on _by default_ for any new bucket. What more, exactly, do you want?
> It's literally, and I do mean this literally, 1 click to block all public traffic to an S3 bucket.
I'm reasonably certain that for quite a while blocking all public access has been the default, and it is multiple clicks through scary warnings (through the console; CLI or IaC are simpler) to enable public access.
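For completeness, the account-wide version of that switch is also a single API call, via boto3's s3control client; a sketch with a placeholder account ID, applying to every bucket in the account on top of any per-bucket settings:

    import boto3

    s3control = boto3.client("s3control")
    # Block public access account-wide, overriding any per-bucket laxity.
    s3control.put_public_access_block(
        AccountId="123456789012",  # placeholder account ID
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )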
I thought the cloud was safe; that is why you pay a premium.
They just sell pickaxes; they don't care if you plant one right through your foot.
No, that is not the sales pitch to enterprise customers. They are pitching that sysadmins are stupid and that security nowadays is too complicated, hence the cloud is the only safe solution.
Yet every month I see a story here about a huge data leak from an unrestricted bucket.