Which is a complete non-issue. It’s $99 / year, basically a symbolic amount just high enough to prevent spammers from making a billion accounts.
I have no problems with this. Notarizing your app is trivial and takes just a few minutes. As a user I want to know who actually produced an app and ensure it wasn’t tampered with.
It’s because the company can effectively print whatever they like as the product’s name, with no regard to the actual ingredients.
That is not true at all. There are laws that determine what you can actually put on your product’s label, especially for food.
“don't let apple tell you they invented it.”
Why always the knee-jerk anti-Apple reaction, even when they do something good?
FYI: Apple isn’t telling anyone they invented this. In fact, they didn’t even tell anyone about this feature and declined to comment after it was discovered and people started asking questions.
I wish TLoU 1 gave you the option to sacrifice Ellie. Have an alternate ending where they find a vaccine and everyone lives happily ever after (except Ellie).
I mean, you have to explicitly give permission before apps can access the camera.
How can an app turn on the camera without your consent?
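To make that concrete, here’s a minimal Kotlin sketch of the equivalent runtime check on Android (the iOS model works the same way in spirit: the OS shows the consent dialog, not the app). The startCamera and showCameraDeniedMessage helpers are hypothetical placeholders:

    import android.Manifest
    import android.content.pm.PackageManager
    import androidx.activity.result.contract.ActivityResultContracts
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.content.ContextCompat

    class CameraGateActivity : AppCompatActivity() {

        // The OS shows the consent dialog; the app only learns the user's answer.
        private val requestCamera =
            registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
                if (granted) startCamera() else showCameraDeniedMessage()
            }

        fun openCameraIfAllowed() {
            val alreadyGranted = ContextCompat.checkSelfPermission(
                this, Manifest.permission.CAMERA
            ) == PackageManager.PERMISSION_GRANTED

            if (alreadyGranted) {
                startCamera()
            } else {
                // Without explicit user consent, the camera APIs are simply unavailable.
                requestCamera.launch(Manifest.permission.CAMERA)
            }
        }

        private fun startCamera() { /* hypothetical: actually open the camera */ }
        private fun showCameraDeniedMessage() { /* hypothetical: explain the denial */ }
    }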
Showing that it’s unsustainable is kind of the point of the original game Monopoly is based on: The Landlord’s Game.
And yet, I’ve never run into RAM problems on iPhones, either as a user or as a developer. On iOS an app can use almost all of the RAM if needed, as long as it’s running in the foreground. Android, by contrast, is much stingier with RAM, especially with Java/Kotlin apps. There are hard limits on how much RAM you can actually use, and it’s a small fraction of the total amount. The actual limit is set by the manufacturer and differs per device; Android itself only guarantees a minimum of 16 MB per app.
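If you want to see those limits on a given device, here’s a minimal Kotlin sketch using the standard Android APIs (ActivityManager.memoryClass, ActivityManager.largeMemoryClass, Runtime.maxMemory); the numbers you get back vary per device, as noted above:

    import android.app.ActivityManager
    import android.content.Context

    // Query the manufacturer-set per-app heap limits described above.
    fun logHeapLimits(context: Context) {
        val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager

        // Standard per-app heap limit in MB. Android only guarantees a 16 MB
        // minimum; the actual value is chosen by the device manufacturer.
        val standardMb = am.memoryClass

        // Larger limit for apps that opt in with android:largeHeap="true".
        val largeMb = am.largeMemoryClass

        // What the runtime will actually let this process grow to, in MB.
        val runtimeMaxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024)

        println("standard heap limit: $standardMb MB")
        println("largeHeap limit:     $largeMb MB")
        println("runtime max heap:    $runtimeMaxMb MB")
    }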
The reason is probably that Android is much more lenient about letting stuff run in the background, so it needs to limit per-app memory usage.
Those apps also use more RAM than an equivalent iOS app, simply because they run on a garbage-collected runtime. With a GC there is a trade-off between performance and memory usage: a GC always wastes some memory, because memory isn’t freed immediately once it’s no longer in use; it’s only freed when the GC runs. If you run the GC very often, you waste little RAM at the cost of performance (all the CPU cycles spent collecting); if you run it at large intervals, you waste a lot of RAM (because you let a lot of ‘garbage’ accumulate before cleaning it up). In general, to achieve performance similar to non-GC’d code you need to tune it to use about 4 times as much RAM. The actual overhead depends on how Google tuned the GC in ART, combined with the behavior of specific apps.
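Here’s a minimal Kotlin sketch of that “garbage accumulates until the collector runs” behavior, written against a plain JVM for simplicity (ART on Android behaves analogously but is tuned differently, and System.gc() is only a hint to the runtime):

    // Dead objects keep occupying heap space until the GC actually runs.
    fun usedHeapMb(): Long {
        val rt = Runtime.getRuntime()
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024)
    }

    fun main() {
        println("before allocations:  ${usedHeapMb()} MB")

        // Allocate ~200 MB of short-lived garbage; every reference is dead
        // as soon as the loop iteration ends.
        repeat(200) {
            val garbage = ByteArray(1024 * 1024) // 1 MB
            garbage[0] = 1 // touch it so it isn't optimized away
        }

        // The objects are unreachable, but typically still count against the
        // heap until a collection happens.
        println("after dropping refs: ${usedHeapMb()} MB")

        // System.gc() is only a hint, but most JVMs honor it with a full collection.
        System.gc()
        Thread.sleep(100) // give the collector a moment

        println("after System.gc():   ${usedHeapMb()} MB")
    }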
Note that this only applies to apps running in ART; many system components, like the web browser, are written in C++ and don’t suffer from this inefficiency. But it does mean Android uses more RAM than iOS while at the same time giving apps less RAM to actually use.
It basically comes down to different architectural choices made by Google and Apple.
It’s $99 a year. I wish my hobbies were that cheap.