this post was submitted on 20 Nov 2024
48 points (90.0% liked)

Technology

Israeli forces are using an AI weapons system in Gaza, co-produced by an Indian defence company, that turns machine guns and assault rifles into computerised killing machines, Middle East Eye can reveal.

According to documents and news reports seen by MEE, Israeli forces have been using the Arbel weapons system in Gaza following their devastating invasion of the enclave after the 7 October attacks on southern Israel.

Touted as a "revolutionary game changer that improves operator lethality and survivability," the Arbel system upgrades machine guns and assault rifles - such as the Israeli-produced Tavor, Carmel and Negev - with algorithms that boost soldiers' chances of hitting targets accurately and efficiently.

Although defence analysts say the weapon system may not be as cutting-edge or as widely used as the "Lavender" or "The Gospel" AI systems - which are reported to have played a huge role in the tremendous death toll in Gaza - Arbel appears to be the first weapons system to directly tie India to Israel's rapidly expanding AI war in Gaza, which could have wide-ranging implications for other conflicts.

all 9 comments
[–] Vanth@reddthat.com 18 points 16 hours ago (3 children)

Am I reading the techno-babble accurately?

You would muzzle sweep your target with the trigger pressed, and it would fire only when your gun is actually aimed for the most-probable strike. If you're off target, it doesn't fire - bullets saved, and probably better accuracy because the shooter isn't fighting as much recoil.

So even less training needed and even further removed from human decision making. Soldiers didn't murder that unarmed civilian, AI did.
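The gating mechanism described in this comment - trigger held, but the round only releases when the computed aim is on target - can be sketched roughly as below. This is purely illustrative: the function names, the aim-error metric, and the threshold are all invented for the sketch; nothing about the actual Arbel system's internals is public.

```python
# Hypothetical sketch of trigger-gating fire control as described above.
# All names and thresholds are invented; this is NOT the real system.

def should_fire(trigger_held: bool, aim_error_mrad: float,
                threshold_mrad: float = 2.0) -> bool:
    """Release a round only while the trigger is held AND the
    predicted aim error falls inside the acceptable cone."""
    return trigger_held and aim_error_mrad <= threshold_mrad

# Sweeping the muzzle across a target with the trigger held:
# only the samples where the predicted point of impact is close
# enough actually produce a shot.
sweep = [8.5, 4.2, 1.9, 0.7, 1.4, 3.8, 9.0]  # aim error per sample, mrad
shots = [err for err in sweep if should_fire(True, err)]
print(shots)  # only the in-cone samples fire
```

Under this (assumed) model, the rounds "saved" are exactly the sweep samples rejected by the gate, which is what the comment means by better ammunition economy and less felt recoil between effective shots.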

[–] agelord@lemmy.world 4 points 12 hours ago (1 children)

The AI didn't press the trigger, soldiers did.

[–] NeoNachtwaechter@lemmy.world 10 points 15 hours ago (2 children)

Soldiers didn't murder that unarmed civilian, AI did.

Soldier did the (rough) aiming.
Soldier pulled the trigger.

Still hard to blame AI for it, don't you think?

[–] catloaf@lemm.ee 8 points 15 hours ago

For reasonable people. For others, it's an avenue to get away with war crimes.

[–] TseseJuer@lemmy.world -5 points 15 hours ago (1 children)
[–] ComradeMiao@lemmy.world 1 points 12 hours ago

Who you’re responding to is speaking against the soldier.

[–] just_another_person@lemmy.world 6 points 16 hours ago

Yes, you're relying on an offline inference device to make trigger choices. Basically "if brown, shoot" from what I gather.

[–] sunzu2@thebrainbin.org 5 points 14 hours ago

US tech companies are funding and supporting Israeli firms that are beta testing "AI" surveillance tools in Gaza and the West Bank...

Fake news never cover this one...