So it’s not just an unlucky mistake. It’s not just a poorly designed deployment process. They ignored actual smaller-scale failures.
This interview is wild.
Washington has fairly strong whistleblower protections. Even if I was disposed toward taking retaliation against somebody in that situation, and I’m not, I’d be very remiss to do that in the state of Washington.
Um wat
So is this what Mozilla meant when they announced a privacy push back in February?
How does it compare to Greenshot?
In short, yes, I support those other things.
… but back to the geoengineering: are you saying that we aren’t on the brink yet? Are you saying it’s not that bad?
That I don’t understand why we’re not doing more of those two things. Geoengineering seems to have strong opposition even within climate activist circles, and nuclear power use is on the decline.
Disaster is nearing. Mass displacement. Mass starvation. Mass death. It is all imminent. Do you understand?
That sounds really bad. So then I have a couple of questions.
… if not, then how could the consequences of that be worse than mass death and starvation?
California has pushed out badly worded laws in the past. Here’s a definition from the bill.
“Artificial intelligence model” means an engineered or machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs that can influence physical or virtual environments and that may operate with varying levels of autonomy.
Tell me that wouldn’t also apply to a microwave oven.
After several years of using Linux for work and school, I made the leap to daily driving Linux on my personal computer. I stuck with it for two years. I sank hundreds of hours into an endless stream of inane troubleshooting. Linux preys on my desire to fix stuff and my insane belief that just one more change, suggested by just one more obscure forum post, will fix the issue.
… the lack of an increment operator, no “continue” instruction, and array indices starting from 1 instead of 0. These differences can be jarring.
Understatement
It depends. It will not affect many of them until 2025, when enterprise support for v2 ends, and by then other arrangements and fixes might be in place. I would not worry about Brave in particular yet.
Something I often see missing from discussions on privacy is that it’s not always about you, the listener. Sometimes it’s about protecting the most vulnerable people around you. For example, someone escaping from domestic violence might have a different view on how their information is protected. People struggle to see the value in privacy because it hasn’t been a big problem for them personally or because they think it’s hopeless. An introduction to privacy, in my view, is all about teaching empathy and hope, and about advocating for others.
Once they have that goal in mind, you can tie in how open source helps empower people to take back their privacy.
This has nothing to do with the Files app, nor does it have anything to do with re-indexing of the Photos library. This has to do with fighting CSAM. Apple has started (in this or a previous update) to scan your device (including deleted files) for anything containing nudity (search for “brassiere”) and to add it to your photos library in a hidden way. That way, anything that the models detect as nudity is stored in your iCloud database permanently. Apple is doing this because it allows them to screen for unknown CSAM material. Currently it can only recognize known fingerprints, but doing this allows them (and the other parties that have access to your iCloud data) to analyze unknown media.
The bug mentioned here accidentally made those visible to the user. The change updates the assets in the library in a way that removes the invisibility flag, hence people noticing that there are old nudes in their library that they cannot delete.
…
And speaking of deleting things: things are never really deleted. The iPhone keeps a record of messages and media you delete inside the KnowledgeC database. This is often used for forensic purposes. Apple is migrating this to the Biome database, which has the benefit of being synchronized to iCloud. It is used to feed Siri with information, among other things. Anything you type into your devices, or fingerprints of anything you view, are sent to Apple’s servers and saved. Spooky, if you ask me. But the only way we can have useful digital assistants is if they have access to everything; that’s just how it works.
Nudes are meant to persist on iPhone. You’re just not meant to notice.
I wonder how good this model would be at an obfuscated code challenge.
This is all they really said IMO:
My tendency these days is to try to use the term “machine learning” rather than AI
The initial results showed something that should have been obvious to anyone: more data beats more parameters.
That makes a lot of sense!
Might be factoring in more than just state income tax. There’s also sales tax, property tax, etc.
Purely speculation, but I wonder if this is a case of having some old, very low-quality photos and trying to enhance and upscale them for the show.
Which comment?