
A report by The Economist magazine described a new arms race over the use of artificial intelligence in developing weapons and fighting wars, and indicated that artificial intelligence tools have already played an effective role in modern wars.

The report indicated that tools and weapons that can employ artificial intelligence have been used on an increasing scale in places such as Gaza and Ukraine, at a time when spending on such types of weapons is rising.

It pointed to demand for the Storm Cloud system, one of the systems that employ artificial intelligence in developing weapons, in military operations, and in military aircraft maintenance.

On the other hand, human rights activists and jurists warn that using artificial intelligence in weapons will make war more deadly and less humane, especially with the increasing possibility of war breaking out between the great powers.

Armies’ prowess in war has come to depend on artificial intelligence as part of technological progress, pushing the United States and China into a competition for supremacy in this field in order to shape the future global landscape, according to The National Interest magazine.

According to the same magazine, although artificial intelligence has gained great popularity over the past few years, great powers have been researching military applications for artificial intelligence for decades. Since 2014, the United States has been working to build the foundation for integrating artificial intelligence into its military.

A recent study by the Rand Corporation, a think tank, found that artificial intelligence plays an important role in the modernization and upkeep of military aircraft by predicting when those aircraft will need maintenance.

According to a report by The Economist magazine, the US Army uses algorithms to predict when Ukrainian howitzers will need new barrels, while some armies use artificial intelligence to help evaluate and qualify soldiers.
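Neither the RAND study nor The Economist describes the models involved, but the core of such predictive maintenance is simple: accumulate a wear estimate for a component and flag it before it crosses a life budget. A minimal sketch, assuming a hypothetical firing log and a fixed barrel-life budget, both invented for illustration:

```python
# Minimal predictive-maintenance sketch (illustrative only; the actual
# systems described by RAND and The Economist are not public).
# Assumption: barrel wear grows with rounds fired, weighted by propellant
# charge, and the barrel is flagged at a fixed wear budget.

from dataclasses import dataclass

@dataclass
class FiringRecord:
    rounds: int           # rounds fired in this session
    charge_factor: float  # relative wear per round at this charge (1.0 = standard)

BARREL_WEAR_BUDGET = 2500.0  # hypothetical life in standard-charge rounds

def effective_rounds(history: list[FiringRecord]) -> float:
    """Total wear expressed in standard-charge equivalent rounds."""
    return sum(r.rounds * r.charge_factor for r in history)

def remaining_life(history: list[FiringRecord]) -> float:
    """Effective rounds left before the barrel needs replacement."""
    return max(0.0, BARREL_WEAR_BUDGET - effective_rounds(history))

history = [FiringRecord(300, 1.0), FiringRecord(120, 1.8), FiringRecord(500, 0.6)]
print(f"remaining barrel life: {remaining_life(history):.0f} effective rounds")
```

A fielded system would learn the wear weights from inspection data rather than fix them by hand, but the flag-before-failure logic is the same.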

Russia and Ukraine have both developed software that allows drones to fly to their targets autonomously, even if jamming disrupts the operator’s control of the aircraft.
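Neither country’s software is public, but the approach usually described is a fallback controller: the drone obeys operator commands while the radio link holds and switches to onboard guidance toward the last confirmed target once jamming severs it. A minimal sketch of that hand-off logic, with all names and the simple heading-only guidance invented for illustration:

```python
# Sketch of link-loss fallback guidance (hypothetical; not the actual
# Ukrainian or Russian software, which is not public).
import math

def steer_toward(pos, target):
    """Heading (radians) from current position to the locked target."""
    return math.atan2(target[1] - pos[1], target[0] - pos[0])

def guidance_step(pos, target, operator_cmd, link_alive: bool):
    """Operator flies the drone while the link holds; on jamming,
    fall back to autonomous flight toward the last confirmed target."""
    if link_alive:
        return operator_cmd            # manual control
    return steer_toward(pos, target)   # terminal autonomy

# Example: link lost at (0, 0) with a target locked at (400, 300)
heading = guidance_step((0, 0), (400, 300), operator_cmd=0.0, link_alive=False)
print(f"autonomous heading: {math.degrees(heading):.1f} degrees")
```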

Several countries are competing to develop, manufacture, and possess “hypersonic weapons” that rely on artificial intelligence and that could change the rules of the game in war, according to a September 16 report by the Wall Street Journal.

Hypersonic weapons are capable of attacking very quickly, can be launched from great distances, evade most air defenses, and can carry conventional explosives or nuclear warheads.

However, experts point out that the technology and targeting algorithms still face many of the same problems that self-driving cars face, such as crowded streets and ambiguous objects.

In the opinion of The Economist, artificial intelligence can process more than just phone calls or photos.

The magazine notes that the British Navy announced last March that its mine-detection unit had completed a year of trials in the Arabian Gulf using a small autonomous boat that can search for mines on the seabed and alert ships or other units to them.

In most cases, artificial intelligence identifies a signal in the noise or an object in the chaos, and determines whether it is a truck or a tank, a fishing vessel or a submarine. However, identifying human combatants may be more complex.
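That “signal in the noise” step is, at bottom, a classifier with a confidence threshold: detections below the threshold are referred to a human instead of acted on. A minimal sketch, with invented class scores standing in for a trained model’s output:

```python
# Sketch of threshold-gated classification (illustrative; real targeting
# classifiers and their thresholds are not public).

CONFIDENCE_THRESHOLD = 0.90  # hypothetical cut-off

def classify(scores: dict[str, float]) -> str:
    """Return the top class if it clears the threshold, else defer."""
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    if confidence < CONFIDENCE_THRESHOLD:
        return f"UNCERTAIN ({label} @ {confidence:.2f}) -> refer to a human"
    return f"{label} @ {confidence:.2f}"

# Invented model outputs for two detections
print(classify({"tank": 0.96, "truck": 0.03, "fishing_vessel": 0.01}))
print(classify({"tank": 0.55, "truck": 0.40, "fishing_vessel": 0.05}))
```

Distinguishing a truck from a tank fits this mold well; as the article notes, deciding whether a person is a combatant does not reduce so cleanly to a score.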

Last April, Israeli media claimed that the Israeli military was using an artificial intelligence tool known as Lavender to identify thousands of Palestinians as targets, with human operators only cursorily reviewing the system’s output before ordering strikes.

But the Israeli army responded by saying that Lavender was “merely a database intended to compare intelligence sources.”

The Economist report quotes Clint Hinote, a retired American general, and Mick Ryan, a retired Australian general, as saying that the Ukrainian GIS Arta program, which collects data on Russian targets for artillery units, “can already create lists of potential targets” according to the commander’s priorities.
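Mechanically, a “list of potential targets according to the commander’s priorities” amounts to a scoring-and-sorting step over fused sightings. A minimal sketch with hypothetical fields and weights (GIS Arta’s actual scoring is not public):

```python
# Sketch of priority-weighted target ranking (hypothetical scoring;
# GIS Arta's internals are not public).
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    value: float      # commander-assigned importance, 0..1
    freshness: float  # confidence the sighting is still current, 0..1
    range_km: float

def score(t: Target, max_range_km: float = 30.0) -> float:
    """Weight importance by how current the sighting is, and
    penalize targets near the edge of artillery range."""
    reach = max(0.0, 1.0 - t.range_km / max_range_km)
    return t.value * t.freshness * reach

targets = [
    Target("air-defence radar", 0.9, 0.80, 12.0),
    Target("supply truck",      0.4, 0.95,  5.0),
    Target("command post",      1.0, 0.50, 25.0),
]
for t in sorted(targets, key=score, reverse=True):
    print(f"{score(t):.2f}  {t.name}")
```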

According to Generals Hinote and Ryan, one of the reasons for Russia’s advances in Ukraine in recent months is that Russian command-and-control (C2) systems are getting better at processing information from drones and relaying it to combatants and weapons.

The US Air Force recently asked the RAND Corporation to evaluate whether artificial intelligence tools could provide information and options to a space warfighter dealing with a potential threat to a satellite. The result was that AI can indeed recommend high-quality guidance.

The Economist report notes that the Pentagon’s research agency, DARPA, is working on a strategy-and-planning program to produce guidance for commanders during war.

“A lot of the techniques that are used in the strategy and planning project didn’t exist two to five years ago,” says Eric Davis, the program’s manager at the agency.

Legal experts and activists point out that the increasing role of artificial intelligence in war is fraught with risks, as “current systems cannot recognize people’s hostile intentions.”

Noam Lubell, of the University of Essex, says: “These programs cannot distinguish between a short soldier carrying a real gun and a child carrying a toy gun, or between a wounded soldier lying on a rifle and a sniper ready to fire from a sniper rifle.”

Stuart Russell, a computer scientist, explains: “Artificial intelligence tools can be deceived by printing or designing markings on non-military objects, such as light poles, to make them appear to the weapon as tanks.”
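Russell is describing the adversarial-example problem: a small, crafted perturbation shifts a model’s output class. A toy demonstration of the fast gradient sign method (FGSM) against a made-up linear classifier, purely to show the mechanism, not any real targeting model:

```python
# Toy adversarial-perturbation (FGSM) demo against a linear classifier --
# the mechanism behind Russell's warning, not any real targeting system.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=100)                               # toy linear "tank detector"
x = -0.1 * np.abs(rng.normal(size=100)) * np.sign(w)   # an input scored "not a tank"

def tank_score(v):
    return float(w @ v)                 # > 0 reads as "tank"

# For a linear model the gradient of the score w.r.t. the input is w,
# so the FGSM step nudges each feature along sign(w).
eps = 0.3                               # small per-feature budget
x_adv = x + eps * np.sign(w)

print(f"clean score:     {tank_score(x):+.1f}")      # negative: not a tank
print(f"perturbed score: {tank_score(x_adv):+.1f}")  # positive: misread as a tank
```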

Experts warn of the errors of weapons that rely on artificial intelligence, point out that the resulting mistakes can be devastating, and demand that humans not be excluded from the decision to fire.
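In software terms, “not excluding humans from the decision to fire” means a hard confirmation gate: the system may rank and recommend, but engagement requires explicit human approval. A minimal sketch of such a gate (a hypothetical interface, not any fielded system):

```python
# Sketch of a human-in-the-loop engagement gate (hypothetical design,
# illustrating the safeguard experts demand).

def request_engagement(target_id: str, ai_confidence: float,
                       human_approval: bool) -> str:
    """The AI may recommend, but firing requires explicit human approval."""
    if not human_approval:
        return f"{target_id}: HOLD -- awaiting human decision"
    if ai_confidence < 0.9:
        return f"{target_id}: HOLD -- confidence {ai_confidence:.2f} too low"
    return f"{target_id}: ENGAGE authorized by human operator"

print(request_engagement("T-017", ai_confidence=0.97, human_approval=False))
print(request_engagement("T-017", ai_confidence=0.97, human_approval=True))
```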

The report quoted Palmer Luckey, founder of Anduril, one of the companies involved in the Storm Cloud system, as saying, “It’s really tempting, but you can break international law.”

Luckey acknowledged that AI would be less important in the “dirty, messy, terrible” mission of urban warfare, similar to the Gaza war.

Last April, the Secretary-General of the United Nations, António Guterres, indicated that reports about the Israeli army’s use of artificial intelligence in its war on Gaza raise “concern” that “life and death decisions” will become linked to “calculations made by algorithms.”

A resolution of the United Nations Human Rights Council condemned “Israel’s use of explosive weapons with wide-area effects in the populated areas of Gaza” and the use of artificial intelligence “to assist in the military decision-making process,” considering that this “may contribute to international crimes.”

Originally appeared on www.alhurra.com
