this post was submitted on 10 Dec 2023
55 points (98.2% liked)

Selfhosted


I have several TB of borg backups that I uploaded to Backblaze B2. There I could immediately see how much storage I was using, how many API calls I was making, and so on. It was very easy to see and predict the next bill, and I could tell exactly which bucket used the most resources and which was growing over time.

Because I'm cheap, I want to move those files to AWS Glacier, which theoretically costs about a quarter of B2 for storage, but whose API calls are extremely expensive. So I want to know the details: I would not like to end up with a bill of $5 in storage and $500 in API calls.
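Back-of-envelope, the request charges seem to depend entirely on how many objects the backup turns into. This is how I'm reasoning about it, assuming a price of roughly $0.05 per 1,000 PUT/lifecycle requests into Deep Archive (my assumption from the published us-east pricing, not verified against a real bill):

```python
# Assumed price: ~$0.05 per 1,000 PUT/lifecycle requests into Glacier Deep Archive (us-east).
PUT_PER_1000 = 0.05

def upload_request_cost(object_count: int) -> float:
    """One-off request cost for uploading `object_count` objects into Deep Archive."""
    return object_count / 1_000 * PUT_PER_1000

# A few TB of borg segments (default segment size is ~500 MB, I believe) is only a few
# thousand objects, so the request side stays at pennies:
print(upload_request_cost(6_000))        # ~$0.30

# The same data split into 1 MB objects would be millions of PUTs, and requests dominate:
print(upload_request_cost(3_000_000))    # ~$150.00
```

But that's me guessing at prices, which is exactly the problem: I want AWS itself to show me the running total.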

I uploaded a backup, but nowhere in AWS can I see how much storage I'm using, how much I'm going to pay, how many API calls I've made, how much user XYZ has spent, and so on.

It looks like it's designed around the attitude of "just use our products freely, don't worry about pricing, that's a problem for your company's finance department".

In the AWS console I found "S3 Storage Lens", but it says I need to delegate access to someone else, for reasons. I tried to create another user in my one-user organization, but after wasting two hours I couldn't find a way to grant those permissions.

I tried to create a dashboard in AWS Cost Explorer, but all the indicators are null or zero.
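For the record, the query I'd expect to work through the API once the data shows up looks something like this (a rough boto3 sketch; I believe Cost Explorer data can take up to ~24 hours to appear, and the Cost Explorer API itself charges around $0.01 per call, so it's not free either):

```python
import boto3

# Daily unblended cost for the account, grouped by service.
ce = boto3.client("ce", region_name="us-east-1")  # Cost Explorer endpoint lives in us-east-1

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-12-01", "End": "2023-12-11"},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for day in resp["ResultsByTime"]:
    for group in day["Groups"]:
        cost = float(group["Metrics"]["UnblendedCost"]["Amount"])
        if cost > 0:
            print(day["TimePeriod"]["Start"], group["Keys"][0], round(cost, 4))
```

Right now that returns the same nothing the dashboard does.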

So, how can I see how many API calls I'm making and how much storage I'm using, in order to predict the final bill? Or is the only option to pray, wait for the end of the month, and hope everything is itemized in detail?
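The closest thing I've found so far is CloudWatch: S3 publishes free daily storage metrics per bucket and storage class, while per-request metrics apparently have to be enabled separately and are paid. A minimal boto3 sketch, assuming the bucket name and the `DeepArchiveStorage` storage-class dimension are right for my setup:

```python
import boto3
from datetime import datetime, timedelta, timezone

# Daily bucket size in Glacier Deep Archive, from the free S3 storage metrics.
# "my-borg-backups" and the StorageType value are my assumptions -- adjust for your bucket.
cw = boto3.client("cloudwatch", region_name="us-east-2")

resp = cw.get_metric_statistics(
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-borg-backups"},
        {"Name": "StorageType", "Value": "DeepArchiveStorage"},
    ],
    StartTime=datetime.now(timezone.utc) - timedelta(days=7),
    EndTime=datetime.now(timezone.utc),
    Period=86400,            # these storage metrics are only published once a day
    Statistics=["Average"],
)

for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"].date(), round(point["Average"] / 1024**3, 2), "GiB")
```

That covers storage, but still doesn't tell me how many API calls I've already burned.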

[–] DeltaTangoLima@reddrefuge.com 18 points 11 months ago

As many others have said, AWS has a pricing calculator that lets you estimate your likely costs.

As a rough calculation in the tool for us-east-2 (Ohio): if you PUT (a paid action) 1,000 objects per month of 1024 MB each (1 TB), and lifecycle-transition all 1,000 objects each month into Glacier Deep Archive (another paid action), you'll pay around $1.11 USD per month. You pay nothing to transfer the data in from the internet.
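Roughly the same number falls out of a quick script. The per-unit prices below are my assumptions for us-east-2 at the time of writing, so double-check them against the calculator:

```python
# Assumed prices (check the AWS calculator for your region):
STORAGE_PER_GB_MONTH = 0.00099   # Glacier Deep Archive storage
PUT_PER_1000 = 0.005             # PUT into S3 Standard
TRANSITION_PER_1000 = 0.05       # lifecycle transition into Deep Archive

objects = 1_000
gb_total = 1_000 * 1.024         # 1,000 objects x 1024 MB ~= 1 TB

monthly = (
    gb_total * STORAGE_PER_GB_MONTH
    + objects / 1_000 * PUT_PER_1000
    + objects / 1_000 * TRANSITION_PER_1000
)
print(f"~${monthly:.2f}/month")  # ~$1.07
```

The remaining few cents in the calculator's $1.11 figure are presumably the Standard storage the objects consume before the lifecycle rule moves them.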

Glacier Deep Archive is what I use for my backups. I have a 2N+C backup strategy, so I only ever intend to restore from these backups if both of my two local copies of the data are unavailable (e.g. a house fire). In that instance, I'll pay a price for retrieval, and endure a waiting period.
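If that day ever comes, kicking off the restore looks roughly like this (an untested sketch with placeholder names; the Bulk tier is the cheapest and, for Deep Archive, can take up to about 48 hours before objects are readable again):

```python
import boto3

# Request a Bulk restore for everything under a prefix (bucket/prefix names are placeholders).
s3 = boto3.client("s3")
bucket, prefix = "my-borg-backups", "repo/"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        s3.restore_object(
            Bucket=bucket,
            Key=obj["Key"],
            RestoreRequest={
                "Days": 7,                                  # keep the restored copy around for a week
                "GlacierJobParameters": {"Tier": "Bulk"},   # cheapest, slowest retrieval tier
            },
        )
```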