To prevent people from simply copying the "age proof" and letting others reuse it, a nonce/private-key combo is needed. To protect that key, a DRM-style locked-down device is necessary, which conveniently removes your ability to know what your device is doing. Just a "trust us".
Seeing as the EU doesn't make any popular hardware, their plan will always rely on either Asian or US manufacturers implementing the black-box "safety" chip.
If the idea is to hide some data handled by the app, that data will be extracted almost instantly.
There are plenty of people passing full integrity on rooted phones. It's really annoying to set up and keep working, and requiring it would fuck over most rooted-phone/custom-OS users, but someone who can fully inspect and leak everything about the app will always pop up.
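A minimal sketch of why the nonce matters, assuming a device-guarded Ed25519 key (all names here are hypothetical, using the Python `cryptography` package, not any real EU or Google API):

```python
# Sketch of a challenge-response age proof: the proof is bound to a fresh
# nonce, so copying an old proof to another request does nothing.
# Hypothetical names only; not a real verification API.
import os, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

device_key = Ed25519PrivateKey.generate()   # the key the locked-down device is meant to guard
device_pub = device_key.public_key()

def site_make_challenge() -> bytes:
    # Fresh nonce + timestamp, so an old proof cannot simply be replayed.
    return os.urandom(16) + int(time.time()).to_bytes(8, "big")

def device_prove_age(challenge: bytes) -> bytes:
    # The device signs "over 18" bound to this exact challenge.
    return device_key.sign(b"over18:" + challenge)

def site_verify(challenge: bytes, proof: bytes) -> bool:
    try:
        device_pub.verify(proof, b"over18:" + challenge)
        return True
    except InvalidSignature:
        return False

challenge = site_make_challenge()
proof = device_prove_age(challenge)
assert site_verify(challenge, proof)                   # valid for this challenge
assert not site_verify(site_make_challenge(), proof)   # a copied proof fails on a new nonce
```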
Look at the design of DRM chips. They bake the key into hardware. Some keys have been leaked, I think the PlayStation 2 is an example, but typically by a source inside the company.
That applies to Play Integrity, and a lot of getting it working involves juggling various signatures and keys.
The suggestion above, which I replied to, was instead about software-managed keys: something handed to the app which it then stores, with the Google DRM polled to obtain that sacred piece of data. Since the key lives in software, it can be plainly read by the user on a rooted device, which a hardware-backed key cannot.
Play Integrity is hardware-based, but the EU app is software-based, merely polling Google's hardware-backed attestation somewhere in the process.
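A toy model of that distinction (not actual Play Integrity or EU app code): a software-managed key can be dumped from the app's storage by a root user, while a hardware-backed key only ever exposes a signing oracle.

```python
# Toy model: software-held key vs. hardware-backed key.
# Hypothetical classes for illustration only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import serialization

class SoftwareKeyStore:
    """Key handed to the app and stored in its own files/memory.
    On a rooted device the user can read app storage, so the raw key leaks."""
    def __init__(self):
        self._key = Ed25519PrivateKey.generate()

    def sign(self, data: bytes) -> bytes:
        return self._key.sign(data)

    def dump_private_bytes(self) -> bytes:
        # What a root user can do: read the key material out of the app.
        return self._key.private_bytes(
            serialization.Encoding.Raw,
            serialization.PrivateFormat.Raw,
            serialization.NoEncryption(),
        )

class HardwareKeyStore:
    """Key generated inside a secure element; stands in for the chip.
    Software only ever gets signatures back, never the key itself."""
    def __init__(self):
        self.__key = Ed25519PrivateKey.generate()

    def sign(self, data: bytes) -> bytes:
        return self.__key.sign(data)
    # deliberately no dump method: the key never leaves the "hardware"

soft = SoftwareKeyStore()
print(soft.dump_private_bytes().hex())   # extractable on a rooted device
hard = HardwareKeyStore()
print(hard.sign(b"attestation").hex())   # only signatures ever come out
```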
I understand. In the context of digital sovereignty, even if the linked shitty implementation is discarded (as it should be), every correct implementation will require a magic DRM-like chip. This chip will be made by a US or Asian manufacturer, as the EU has no such manufacturing.
The key doesn't have to be on your phone. You can just send the request to some service to sign, identifying yourself to that service in whatever way.
It's that "whatever way" that is difficult. This proposal merely shifts the problem: now the login to that 3rd party can be shared, and age verification subverted.
A phone can also be shared. If it happens at scale, it will be flagged pretty quickly. It's not a real problem.
The only real problem is the very intention of such laws.
How? In a correct implementation, the 3rd parties only receive proof-of-age, no identity. How will re-use and sharing be detected?
There are 3 parties:
1. the user
2. the site that needs the age check
3. the service that signs the proof

The site (2) sends the request to the user (1), who passes it on to the service (3), where it is signed and returned the same way. The request comes with a nonce and a timestamp, making reuse difficult. An unusual volume of requests from a single user will be detected by the service.
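A rough sketch of that flow, with hypothetical names (again using Ed25519 from the Python `cryptography` package): the site only ever sees the service's public key and a signed request, never the user's identity, while freshness comes from the nonce and timestamp.

```python
# Sketch of the three-party flow: site (2) -> user (1) -> service (3) -> back.
# Hypothetical names; not a real verification API.
import os, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

service_key = Ed25519PrivateKey.generate()   # held by the verification service (3)
service_pub = service_key.public_key()       # known to sites (2)

MAX_AGE_SECONDS = 60

def site_create_request() -> bytes:
    """Party 2: fresh nonce + timestamp, handed to the user."""
    return os.urandom(16) + int(time.time()).to_bytes(8, "big")

def service_sign(request: bytes, user_id: str) -> bytes:
    """Party 3: checks the user's age (and request volume) however it likes,
    then signs the request without embedding the user's identity."""
    # rate-limiting / anomaly detection keyed on user_id would happen here
    return service_key.sign(b"over18:" + request)

def site_verify(request: bytes, proof: bytes) -> bool:
    """Party 2: accepts only a fresh, correctly signed request."""
    timestamp = int.from_bytes(request[16:24], "big")
    if time.time() - timestamp > MAX_AGE_SECONDS:
        return False                          # stale: replay of an old proof
    try:
        service_pub.verify(proof, b"over18:" + request)
        return True
    except InvalidSignature:
        return False

# Party 1 (the user) just ferries the bytes between 2 and 3.
request = site_create_request()
proof = service_sign(request, user_id="alice")
print(site_verify(request, proof))                 # True: fresh and correctly signed
print(site_verify(site_create_request(), proof))   # False: proof is bound to the old nonce
```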