Informed Comment – Thoughts on the Middle East, History and Religion (https://www.juancole.com)

Israel's AI-Powered Genocide
https://www.juancole.com/2024/06/israels-powered-genocide.html
Tue, 04 Jun 2024
by Sarmad Ishfaq

We are witnessing the genocide of the Palestinians based on algorithms and machine learning; a system of apartheid in the Israeli-occupied West Bank and Gaza Strip reinforced by artificial intelligence; and surveillance and facial recognition systems of such prowess that Orwell’s 1984 regime would be green with envy. Today’s Israeli-occupied Palestine manifests a dystopian and totalitarian sci-fi movie script as far as the Palestinians are concerned. Moreover, the Zionists are fuelling this AI nightmare.

From the onset of its current war against the Palestinians in Gaza, the Zionist regime, blinded by revenge for 7 October, has leveraged AI in the most indiscriminate and barbaric way to kill tens of thousands of innocent Palestinian civilians. One such insidious AI tool that has dominated the headlines is The Gospel. Since last October, Israel has utilised this AI system to expedite the creation of Hamas targets. More specifically, The Gospel marks structures and buildings that the IDF claims Hamas “militants” operate from. This fast-paced target list, Israel’s disinclination to adhere to international humanitarian law, as well as US support emboldening Prime Minister Benjamin Netanyahu’s government, has led to a modern-day genocide.

The Gospel is used by Israel’s elite 8200 cyber and signals intelligence agency to analyse “communications, visuals and information from the internet and mobile networks to understand where people are,” a former Unit 8200 officer explained. The system was even active in 2021’s offensive, according to Israel’s ex-army chief of staff Aviv Kochavi: “…in Operation Guardian of the Walls [in 2021], from the moment this machine was activated, it generated 100 new targets every day… in the past [in Gaza] we would create 50 targets per year. And here the machine produced 100 targets in one day.”

Israel's apathetic disposition towards civilians is evident as it has given the green light to the killing of many innocents in order to hit its AI-generated targets, whether they be hospitals, schools, apartment buildings or other civilian infrastructure. For example, on 10 October last year, Israel's air force bombed an apartment building killing 40 people, most of them women and children. Israel has also used unguided "dumb" bombs, which cause more collateral damage than precision-guided munitions, against low-level Hamas leadership targets. "The emphasis is on damage and not on accuracy," said the Israel Defence Forces' own spokesperson. This is the primary reason why the civilian death toll is so high, as is the number of wounded. According to one study, the current war's total of 110,000 and growing Palestinian casualties (most of them civilians) is almost six times the 18,992 Palestinian casualties of the previous five military offensives combined.
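The "almost six times" comparison above can be verified directly from the figures the article cites (the totals are the article's, not independently confirmed counts):

```python
# Casualty totals as cited in the article above (not independently verified)
current_war = 110_000        # Palestinian casualties in the current war, "and growing"
previous_five_wars = 18_992  # combined toll of the previous five military offensives

ratio = current_war / previous_five_wars
print(f"{ratio:.1f}x")  # 5.8x, i.e. "almost six times"
```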


“Lavender 3,” digital, Dream/ Dreamworld v. 3, 2024.

Two other AI programmes that have made the news recently are Lavender and Where's Daddy? Lavender differs from The Gospel in that the former marks human beings as targets (creating a kill list), whereas the latter marks buildings and structures allegedly used by combatants. Israeli sources claim that in the first weeks of the war, Lavender was used extensively, generating a list of 37,000 Palestinians marked as "suspected militants" to be killed via air strikes. As previously mentioned, Israel's apathy towards the Palestinians has been evident in its mass killing of civilians in order to eliminate even a single Hamas member. According to Israeli military sources, the IDF decided initially that for every junior Hamas member, 15 or 20 civilians could be killed.

 

This brutality is unprecedented.

If the Hamas member was a senior commander, the IDF on several occasions okayed the killing of over 100 civilians to fulfil its objective. For example, an Israeli officer said that on 2 December, in order to assassinate Wissam Farhat, the commander of Shuja’iya Battalion of the military wing of Hamas, the IDF knew that it would kill over 100 civilians and went ahead with the killing.

If this was not contentious enough, Lavender was also used without any significant checks and balances. On many occasions, the only human scrutiny carried out was to make sure that the person in question was not a female. Beyond this, Lavender-generated kill lists were trusted blindly.

However, the same sources explain that Lavender makes mistakes; it has an error rate of roughly 10 per cent. This means that it sometimes tagged innocent people, or individuals with only loose connections to Hamas, and that Israel deliberately overlooked this.

Moreover, Lavender was also programmed to be sweeping in its target creation. For example, one Israeli officer was perturbed by how loosely a Hamas operative was defined, and by the fact that Lavender was trained on data from civil defence workers as well. Such vague connections to Hamas were exploited by Israel, and thousands were killed as a result. UN figures bear out the devastating effects of this policy: during the first month of the war, more than half of the 6,120 people killed belonged to 1,340 families, many of which were wiped out entirely.

 

Where's Daddy? is an AI system that tracks targeted individuals so that the IDF can assassinate them. This AI, along with The Gospel, Lavender and others, represents a paradigm shift in the country's targeted killing programme. In the case of Where's Daddy? the IDF would purposefully wait for the target to enter his home and then order an air strike, killing not only the target but also his entire family and other innocents in the process. As one Israeli intelligence officer asserted: "We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations." In an even more horrific turn, sometimes the targeted individual would not even be at home when the air strike was carried out, because of the time lapse between when Where's Daddy? sent out an alert and when the bombing took place. The target's family would be killed, but not the target. Tens of thousands of innocent Palestinians, primarily women and children, are believed to have been killed as a result.

Israeli AI software also permeates the occupied West Bank, where it is a part of everyday Palestinian life.

In Hebron and East Jerusalem, Israel uses an advanced facial recognition system dubbed Red Wolf. Red Wolf is used to monitor the movements of Palestinians via the many fixed and "flying" security checkpoints. Whenever Palestinians pass through a checkpoint, their faces are scanned without their approval or knowledge and then checked against other Palestinian biometric data. If an individual is identified by Red Wolf, whether because of a previous detention or because of their activism or protests, the system automatically decides whether this person should be allowed to pass.

If any person is not in the system’s database, their biometric identity and face are saved without consent and they are denied passage. This also means that Israel has an exhaustive list of Palestinians in its database which it uses regularly to crack down not just on so-called militants, but also on peaceful protesters and other innocent Palestinians. According to one Israeli officer, the technology has “falsely tagged civilians as militants.” Moreover, it is highly likely that Red Wolf is connected to two other military-run databases – the Blue Wolf app and Wolf Pack. IDF soldiers can even use their mobile phones to scan Palestinians and access all private information about them. According to Amnesty International, “Its [Red Wolf’s] pervasive use has alarming implications for the freedom of movement of countless Palestinians…”

For years, Israel has used the occupied Palestinian territories as a testing ground for its AI products and spyware. In fact, the country sells “spyware to the highest bidder, or to authoritarian regimes with which the Israeli government wanted to improve relations” on the basis that they have been “field tested”. Israel’s own use of this technology is the best advertisement for its products. The head of Israel’s infamous Shin Bet internal spy agency, Ronen Bar, has stated that it is using AI to prevent terrorism and that Israel and other countries are forming a “global cyber iron dome”. The glaring issue here is Israel’s violation of Palestinian rights through spying on their social media as well as wrongful detention, torture and killing innocent people. “The Israeli authorities do not need AI to kill defenceless Palestinian civilians,” said one commentator. “They do, however, need AI to justify their unjustifiable actions, to spin the killing of civilians as ‘necessary’ or ‘collateral damage,’ and to avoid accountability.”

Israel has entered into a controversial $1.2 billion contract with Google and Amazon called Project Nimbus, which was announced in 2021. The project's aim is to provide cloud computing and AI services for the Israeli military and government. This will allow further surveillance and the illegal collection of Palestinian data. Google and Amazon's own employees dissented and wrote an article in the Guardian expressing their discontent: "[O]ur employers signed a contract… to sell dangerous technology to the Israeli military and government. This contract was signed the same week that the Israeli military attacked Palestinians in the Gaza Strip – killing nearly 250 people, including more than 60 children. The technology… will make the systematic discrimination and displacement carried out by the Israeli military and government even… deadlier for Palestinians." The contract reportedly contains a clause that prevents Google and Amazon from leaving it, so the companies' continued participation is assured. According to Jane Chung, spokeswoman for No Tech For Apartheid, over 50 Google employees have been fired without due process for protesting against Project Nimbus.

The Palestinians are perhaps the bravest people in the world.

Whether contained within the barrel of a gun, a bomb casing, or in the code of an AI system, Israeli oppression will never deter them from standing up for their legitimate rights. Their plight has awoken the world to the nature of the Israeli regime and its brutal occupation, with protests and boycotts erupting in the West and the Global South. Using their propaganda media channels, the US and Israel are trying to placate the billions who support Palestine, even as the genocide remains ongoing. Israel hopes that its Machiavellian system will demoralise the Palestinians and render them obsequious – a people whose screams are silenced – but as always it underestimates their indefatigable spirit which, miraculously, grows stronger with every adversity.

 

The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Monitor or Informed Comment.

Creative Commons License Unless otherwise stated in the article above, this work by Middle East Monitor is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Drone Program Crashing as Spiritually Damaged Pilots Quit in Droves
https://www.juancole.com/2015/03/program-crashing-spiritually.html
Fri, 06 Mar 2015
By Pratap Chatterjee | (Tomdispatch.com)

The U.S. drone war across much of the Greater Middle East and parts of Africa is in crisis, and not because civilians are dying or because the target list for that war, or the right to wage it just about anywhere on the planet, is in question in Washington. Something far more basic is at stake: drone pilots are quitting in record numbers.

There are roughly 1,000 such drone pilots, known in the trade as “18Xs,” working for the U.S. Air Force today. Another 180 pilots graduate annually from a training program that takes about a year to complete at Holloman and Randolph Air Force bases in, respectively, New Mexico and Texas. As it happens, in those same 12 months, about 240 trained pilots quit and the Air Force is at a loss to explain the phenomenon. (The better-known U.S. Central Intelligence Agency drone assassination program is also flown by Air Force pilots loaned out for the covert missions.)

On January 4, 2015, the Daily Beast revealed an undated internal memo to Air Force Chief of Staff General Mark Welsh from General Herbert "Hawk" Carlisle stating that pilot "outflow increases will damage the readiness and combat capability of the MQ-1/9 [Predator and Reaper] enterprise for years to come" and adding that he was "extremely concerned." Eleven days later, the issue got top billing at a special high-level briefing on the state of the Air Force. Secretary of the Air Force Deborah Lee James joined Welsh to address the matter. "This is a force that is under significant stress — significant stress from what is an unrelenting pace of operations," she told the media.

In theory, drone pilots have a cushy life. Unlike soldiers on duty in “war zones,” they can continue to live with their families here in the United States. No muddy foxholes or sandstorm-swept desert barracks under threat of enemy attack for them. Instead, these new techno-warriors commute to work like any office employees and sit in front of computer screens wielding joysticks, playing what most people would consider a glorified video game.

They typically “fly” missions over Afghanistan and Iraq where they are tasked with collecting photos and video feeds, as well as watching over U.S. soldiers on the ground. A select few are deputized to fly CIA assassination missions over Pakistan, Somalia, or Yemen where they are ordered to kill “high value targets” from the sky. In recent months, some of these pilots have also taken part in the new war in the Syrian and Iraqi borderlands, conducting deadly strikes on militants of ISIL.

Each of these combat air patrols involves three to four drones, usually Hellfire-missile-armed Predators and Reapers built by southern California’s General Atomics, and each takes as many as 180 staff members to fly them. In addition to pilots, there are camera operators, intelligence and communications experts, and maintenance workers. (The newer Global Hawk surveillance patrols need as many as 400 support staff.)

The Air Force is currently under orders to staff 65 of these regular “combat air patrols” around the clock as well as to support a Global Response Force on call for emergency military and humanitarian missions. For all of this, there should ideally be 1,700 trained pilots. Instead, facing an accelerating dropout rate that recently drove this figure below 1,000, the Air Force has had to press regular cargo and jet pilots as well as reservists into becoming instant drone pilots in order to keep up with the Pentagon’s enormous appetite for real-time video feeds from around the world.
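The staffing arithmetic in the paragraphs above can be sketched directly from the figures the article cites:

```python
# Staffing figures as cited in the preceding paragraphs
graduates_per_year = 180  # new 18X pilots completing training annually
quits_per_year = 240      # trained pilots leaving annually
ideal_corps = 1_700       # pilots needed to sustain 65 round-the-clock patrols
current_corps = 1_000     # roughly the number actually serving (and falling)

net_flow = graduates_per_year - quits_per_year  # -60: the corps shrinks every year
shortfall = ideal_corps - current_corps         # 700 pilots short of the ideal

print(f"Net pilot flow: {net_flow:+d} per year")
print(f"Shortfall vs. ideal staffing: {shortfall} pilots")
```

With training outpaced by attrition, the gap can only widen, which is why the Air Force has pressed cargo pilots and reservists into drone duty.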

The Air Force explains the departure of these drone pilots in the simplest of terms. They are leaving because they are overworked. The pilots themselves say that it's humiliating to be scorned by their Air Force colleagues as second-class citizens. Some have also come forward to claim that the horrors of war, seen up close on video screens, day in, day out, are inducing an unprecedented, long-distance version of post-traumatic stress disorder (PTSD).

But is it possible that a brand-new form of war — by remote control — is also spawning a brand-new, as yet unlabeled, form of psychological strain? Some have called drone war a “coward’s war” (an opinion that, according to reports from among the drone-traumatized in places like Yemen and Pakistan, is seconded by its victims). Could it be that the feeling is even shared by drone pilots themselves, that a sense of dishonor in fighting from behind a screen thousands of miles from harm’s way is having an unexpected impact of a kind psychologists have never before witnessed?

Killing Up Close and Personal From Afar

There can be no question that drone pilots resent the way other Air Force pilots see them as second-class citizens. “It’s tough working night shifts watching your buddies do great things in the field while you’re turning circles in the sky,” a drone instructor named Ryan told Mother Jones magazine. His colleagues, he says, call themselves the “lost generation.”

“Everyone else thinks that the whole program or the people behind it are a joke, that we are video-game warriors, that we’re Nintendo warriors,” Brandon Bryant, a former drone camera operator who worked at Nellis Air Force Base, told Democracy Now.

Certainly, there is nothing second-class about the work tempo of drone life. Pilots log 900-1,800 hours a year compared to a maximum of 300 hours annually for regular Air Force pilots. And the pace is unrelenting. “A typical person doing this mission over the last seven or eight years has worked either six or seven days a week, twelve hours a day,” General Welsh told NPR recently. “And that one- or two-day break at the end of it is really not enough time to take care of that family and the rest of your life.”

The pilots wholeheartedly agree. “It’s like when your engine temperature gauge is running just below the red area on your car’s dashboard, but instead of slowing down and relieving the stress on the engine, you put the pedal to the floor,” one drone pilot told Air Force Times. “You are sacrificing the engine to get a short burst of speed with no real consideration to the damage being caused.”

The Air Force has come up with a pallid interim “solution.” It is planning to offer experienced drone pilots a daily raise of about $50. There’s one problem, though: since so many pilots leave the service early, only a handful have enough years of experience to qualify for this bonus. Indeed, the Air Force concedes that just 10 of them will be able to claim the extra bounty this year, striking testimony to the startling levels of job turnover among such pilots.

Most 18Xs say that their jobs are tougher and significantly more upfront and personal than those of the far more glamorous jet pilots. “[A] Predator operator is so much more involved in what is going on than your average fast-moving jetfighter pilot, or your B-52, B-1, B-2 pilots, who will never even see their target,” Lieutenant Colonel Bruce Black, a former Air Force drone pilot says. “A Predator pilot has been watching his target[s], knows them intimately, knows where they are, and knows what’s around them.”

Some say that the drone war has driven them over the edge. “How many women and children have you seen incinerated by a Hellfire missile? How many men have you seen crawl across a field, trying to make it to the nearest compound for help while bleeding out from severed legs?” Heather Linebaugh, a former drone imagery analyst, wrote in the Guardian. “When you are exposed to it over and over again it becomes like a small video, embedded in your head, forever on repeat, causing psychological pain and suffering that many people will hopefully never experience.”

“It was horrifying to know how easy it was. I felt like a coward because I was halfway across the world and the guy never even knew I was there,” Bryant told KNPR Radio in Nevada. “I felt like I was haunted by a legion of the dead. My physical health was gone, my mental health was crumbled. I was in so much pain I was ready to eat a bullet myself.”

Many drone pilots, however, defend their role in targeted killings. “We’re not killing people for the fun of it. It would be the same if we were the guys on the ground,” mission controller Janet Atkins told Chris Woods of the Bureau of Investigative Journalism. “You have to get to [the enemy] somehow or all of you will die.”

Others like Bruce Black are proud of their work. “I was shooting two weeks after I got there and saved hundreds of people, including Iraqis and Afghanis,” he told his hometown newspaper in New Mexico. “We’d go down to Buffalo Wild Wings, drink beer and debrief. It was surreal. It didn’t take long for you to realize how important the work is. The value that the weapon system brings to the fight is not apparent till you’re there. People have a hard time sometimes seeing that.”

Measuring Pilot Stress

So whom does one believe? Janet Atkins and Bruce Black, who claim that drone pilots are overworked heroes? Or Brandon Bryant and Heather Linebaugh, who claim that remotely directed targeted killings caused them mental health crises?

Military psychologists have been asked to investigate the phenomenon. A team of psychologists at the School of Aerospace Medicine at Wright-Patterson Air Force Base in Ohio has published a series of studies on drone pilot stress. One 2011 study concluded that nearly half of them had “high operational stress.” A number also exhibited “clinical distress” — that is, anxiety, depression, or stress severe enough to affect them in their personal lives.

Wayne Chappelle, a lead author in a number of these studies, nonetheless concludes that the problem is mostly a matter of overwork caused by the chronic shortage of pilots. His studies appear to show that post-traumatic stress levels are actually lower among drone pilots than in the general population. Others, however, question these numbers. Jean Otto and Bryant Webber of the Armed Forces Health Surveillance Center and the Uniformed Services University of the Health Sciences, caution that the lack of stress reports may only “reflect artificial underreporting of the concerns of pilots due to the career-threatening effects of [mental health] diagnoses, [which] include removal from flying status, loss of flight pay, and diminished competitiveness for promotion.”

Seeing Everything, Missing the Obvious

One thing is clear: the pilots are not just killing “bad guys” and they know it because, as Black points out, they see everything that happens before, during, and after a drone strike.

Indeed, the only detailed transcript of an actual Air Force drone surveillance mission and targeted killing to be publicly released illustrates this all too well. The logs recorded idle chatter on February 21, 2010, between drone operators at Creech Air Force Base in Nevada coordinating with video analysts at Air Force special operations headquarters in Okaloosa, Florida, and with Air Force pilots in a rural part of Daikondi province in central Afghanistan. On that day, three vehicles were seen traveling in a pre-dawn convoy carrying about a dozen people each. Laboring under the mistaken belief that the group were "insurgents" out to kill some nearby U.S. soldiers on a mission, the drone team decided to attack.

Controller: “We believe we may have a high-level Taliban commander.”

Camera operator: “Yeah, they called a possible weapon on the military-age male mounted in the back of the truck.”

Intelligence coordinator: “Screener said at least one child near SUV.”

Controller: “Bullshit! Where? I don’t think they have kids out this hour. I know they’re shady, but come on!”

Camera operator: "A sweet [expletive]! Geez! Lead vehicle on the run and bring the helos in!"

Moments later, Kiowa helicopter pilots descended and fired Hellfire missiles at the vehicle.

Controller: “Take a look at this one. It was hit pretty good. It’s a little toasty! That truck is so dead!”

Within 20 minutes, after the survivors of the attack had surrendered, the transcript recorded the sinking feelings of the drone pilots as they spotted women and children in the convoy and could not find any visual evidence of weapons.

A subsequent on-the-ground investigation established that not one of the people killed was anything other than an ordinary villager. “Technology can occasionally give you a false sense of security that you can see everything, that you can hear everything, that you know everything,” Air Force Major General James Poss, who oversaw an investigation into the incident, later told the Los Angeles Times.

Of course, Obama administration officials claim that such incidents are rare. In June 2011, when CIA Director John Brennan was still the White House counterterrorism adviser, he addressed the issue of civilian deaths in drone strikes and made this bold claim: “Nearly for the past year, there hasn’t been a single collateral death, because of the exceptional proficiency, precision of the capabilities that we’ve been able to develop.”

His claim and similar official ones like it are, politely put, hyperbolic. "You Never Die Twice," a new report by Jennifer Gibson of Reprieve, a British-based human rights organization, settles the question quickly by showing that some men on the White House "kill list" of terror suspects to be taken out have "died" as many as seven times.

Gibson adds, “We found 41 names of men who seemed to have achieved the impossible. This raises a stark question. With each failed attempt to assassinate a man on the kill list, who filled the body bag in his place?” In fact, Reprieve discovered that, in going after those 41 “targets” numerous times, an estimated 1,147 people were killed in Pakistan by drones. Typical was the present leader of al-Qaeda, Ayman al-Zawahiri. In two strikes against “him” over the years, according to Reprieve, 76 children and 29 adults have died, but not al-Zawahiri.
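Reprieve's numbers imply a grim per-target arithmetic, easy to check using only the figures cited in this article:

```python
# Figures from the Reprieve report as cited above
estimated_killed = 1_147  # estimated deaths in Pakistan while pursuing the 41 names
named_targets = 41        # men on the kill list who "died" multiple times

per_target = estimated_killed / named_targets
print(f"{per_target:.0f} people killed per intended target")  # 28

# The strikes aimed at Ayman al-Zawahiri alone:
zawahiri_toll = 76 + 29  # children + adults killed across the two strikes
print(zawahiri_toll)     # 105 dead; the target survived both
```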

Deserting the Cubicle

Back in the United States, a combination of lower-class status in the military, overwork, and psychological trauma appears to be taking its mental toll on drone pilots. During the Vietnam War, soldiers would desert, flee to Canada, or even “frag” — kill — their officers. But what do you do when you’ve had it with your war, but your battle station is a cubicle in Nevada and your weapon is a keyboard?

Is it possible that, like their victims in Pakistan and Yemen who say that they are going mad from the constant buzz of drones overhead and the fear of sudden death without warning, drone pilots, too, are fleeing into the night as soon as they can? Since the Civil War in the U.S., war of every modern sort has produced mental disturbances that have been given a variety of labels, including what we today call PTSD. In a way, it would be surprising if a completely new form of warfare didn’t produce a new form of disturbance.

We don't yet know just what this might turn out to be, but it bodes ill for the form of battle that the White House and Washington are most proud of — the well-advertised, sleek, new, robotic, no-casualty, precision conflict that now dominates the war on terror. Indeed, if the pilots themselves are dropping out of desktop killing, can this new way of war survive?

Pratap Chatterjee is executive director of CorpWatch. He is the author of Halliburton’s Army: How A Well-Connected Texas Oil Company Revolutionized the Way America Makes War and Iraq, Inc. His next book, Verax, a graphic novel about whistleblowers and mass surveillance co-authored by Khalil Bendib, will be published by Metropolitan Books in 2016.

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Book, Rebecca Solnit’s Men Explain Things to Me, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.

Copyright 2015 Pratap Chatterjee

Via Tomdispatch.com

——-
Related video added by Juan Cole:

Wochit General News: “U.S. Drone Attack in Yemen Kills Four Suspected Al Qaeda Militants: Security Sources”

FAA Insists on Regulating Delivery Drones
https://www.juancole.com/2014/06/insists-regulating-delivery.html
Fri, 27 Jun 2014
Via Gas2

The FAA doesn’t think much of Jeff Bezos’ idea to deliver packages to Amazon.com customers using drones. Yesterday the agency issued a proposed ruling that defines such deliveries as part of a “business purpose” and not a recreational or hobby activity. Well, d’uh! TechCrunch reports that the agency said in a footnote to its proposed rule:

If an individual offers free shipping in association with a purchase or other offer, FAA would construe the shipping to be in furtherance of a business purpose, and thus, the operation would not fall within the statutory requirement of recreation or hobby purpose.

Amazon, of course, vows to fight the FAA on this. It is already moving forward with its drone delivery plans via a wholly owned subsidiary called Prime Air. The idea is that customers would place an order online, the nearest Amazon warehouse would load the merchandise aboard a Prime Air drone and within minutes it would be winging its way directly to the customer’s front door. Delivery times measured in hours rather than days should be possible if Bezos’ vision ever becomes reality.

Which may be never. The FAA has authorized test facilities in six states – Alaska, Virginia, New York, Texas, Nevada, and North Dakota – to explore how drones can be integrated into civilian airspace currently occupied by commercial and private aircraft, police and fire aircraft, emergency medical flights, news and traffic reporters and others. The idea of itty bitty drones like those depicted in the movie Minority Report flitting around in between all those regular aircraft makes some people more than a little queasy. They worry about midair collisions between conventional aircraft and pint-size drones that are difficult to see. As the government gains experience from the test facilities, it plans to put together a nationwide drone policy by 2020.

But Amazon is not a company that is accustomed to waiting. It will certainly challenge this latest move by the FAA, building on a judge's ruling in March of this year that struck down a proposed FAA fine against the operator of a radio-controlled model airplane who mounted a camera to the fuselage and used it to make videos that he later sold for commercial purposes. At issue was the definition of what a model aircraft is, exactly, and in that case the court sided with the hobbyist. Amazon is trying hard to convince regulators that it is just going to operate its drones as a hobby. I don't think that argument is gonna fly!

The controversy will keep legions of lawyers busy splitting hairs and parsing paragraphs for years. In the meantime, the dream of having a little drone bring a hot dog and a Bud Light to your seat at your local sports arena is probably not going to become a reality any time soon.

Mirrored from Gas2

——

Related video:

Amazon drones preparing for takeoff despite FAA guidelines
