PAF Centre of Artificial Intelligence and Computing

AI and computing will be part and parcel of near-future warfare.

Situational awareness, fully or semi-automated command and control, and unmanned weapon delivery systems such as remotely controlled or AI-controlled tanks, aircraft, ships and submarines are the near-future realities of war.

Heron Systems' AI algorithm has performed better than a US ace pilot in simulated dogfights.

AI vs. Human Fighter Pilot: Here's Who Won the Epic Dogfight
AlphaDogfight Trials Final Event


AFINITI Pair Better is an AI product that models human behavioral patterns to pair customers with employees for better sales and service.

Yes, this is one peaceful, productive, efficient and profitable use of AI.

But the use of AI in war is a debatable issue.

Should we fully authorize AI to make and execute critical decisions during war based on situational awareness, or should it only present all possible solutions and their risks?

The PAF initiative is focused purely on the military use of AI and computing.

Some respected members underestimate the true potential of our Pakistani youth because of their inadequate education.
Unfortunately, our youth are not being given the right opportunities, and they are discouraged from taking initiative by corrupt and illiterate officials and political leaders.

I am not proud of the fact that the world's first computer virus was created by Pakistanis!
But it is one angle from which to gauge the potential of our youth. If they are given positive opportunities, I am very hopeful they will outperform many... Insha Allah.

The highlighted parts only.

The reason the AI pilot was able to defeat the human is that the USAF has tonnes and tonnes of
data from previous flights.
They digitize each and every thing, over thousands and thousands of flight hours, in all different scenarios.
That is what the AI algorithm uses to train itself.
During the training phase, its learning is checked and all negative inferences are removed.
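
(Purely as an illustration of that workflow: the sketch below trains a simple model on made-up "flight log" features and then checks its learning on held-out data before it is trusted. The feature names, labels and model choice are my own assumptions, not anything from the USAF or PAF.)

```python
# Illustrative sketch only: train on digitized "flight log" data, then check the learning
# on sorties the model has never seen. All features and labels here are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical per-manoeuvre features: [altitude_m, speed_mps, turn_rate_dps, closure_rate_mps]
X = rng.normal(size=(5000, 4))
# Hypothetical label: 1 = the manoeuvre led to a firing solution, 0 = it did not
y = (X[:, 2] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# "Checking the learning": only trust the model if it holds up on unseen data
print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))
```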

Now, do you think PAF is ready to make active use of AI?

The second highlighted comment you made... look at the graduates coming out now:
the lack of mathematical and analytical skills, our outdated curriculum and our teaching methods all
tell us that there will NOT be any miracle.

See the comments on this thread, from people unwilling to accept the truth or respect the facts at hand.
 
I don't work in the IT sector. I do, however, have lots of clients who do. Good companies and bad companies vary in their internal culture, but one thing is constant: the bad companies moan about the "quality of graduates", while the good ones have proper training and apprenticeship programmes and a culture of mentorship.
FFS, a university is designed to take the average lad and lass off the street and make them reasonably learned in their chosen field. You should not expect university graduates to have anything more than a broad, classroom-based understanding of the field.

Newly minted doctors, lawyers, accountants and engineers all need several years of practical, hands-on work before they are deemed qualified. In every profession, a lot has to be learned by **doing**.
No one expects them to be experts right out of medical school, law school, ACCA or engineering.
 
PAF, PA and PN do have a pool of specialist officers who work on the R&D/strat-org side. Air force officers, along with civilian faculty, work in fields like ML and DSP apart from teaching in universities. I know a few who taught my sibling (although the quality may vary).

This is not going to be for civilian R&D.

This would most probably be similar to the below, i.e. a joint AI centre. Collaboration is key. I hope they don't build their own "derh eent ki masjid" (their own one-and-a-half-brick mosque, i.e. go it alone) and instead collaborate with others working in the same field at NUST and the different strat orgs.


With reference to my last post w.r.t collaboration.


Check out this job at RevolveAI: https://www.linkedin.com/jobs/view/2010984299

 
I think we must not think of AI in terms of human-vs-machine fights just yet. There are tens of other applications of AI in aviation. AI can make simulations far more realistic and accurate, and it can help build next-generation flight simulators. Apart from that, AI can really help in designing FCS and in weapons design research. The main issue in AI is the algorithms and their training: building expandable neural networks and training them with accurate data so they can predict various scenarios in their respective applications. In the near future, the real currency will be these AI algorithms.
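
(As a purely illustrative sketch of that last point: a small neural network trained on made-up data so it can predict an output for scenarios it has not seen. The inputs, output and library choice are my assumptions, not a description of any actual FCS or simulator work.)

```python
# Illustrative sketch only: a tiny neural network that learns a flight-dynamics-style
# mapping from made-up data and then predicts outcomes for unseen scenarios.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical inputs: [angle_of_attack_deg, Mach, altitude_km]; output: a control deflection
X = rng.uniform(low=[0.0, 0.3, 1.0], high=[20.0, 1.6, 15.0], size=(10000, 3))
y = 0.8 * X[:, 0] - 5.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(scale=0.2, size=10000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=1)
net.fit(X_train, y_train)

# How well it generalises to scenarios it was never trained on
print("R^2 on unseen scenarios:", net.score(X_test, y_test))
```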
 
PAF opens artificial intelligence and computing centre
The Newspaper's Staff Reporter 28 Aug 2020
ISLAMABAD: Pakistan Air Force (PAF) achieved yet another milestone by inaugurating the ‘Centre of Artificial Intelligence and Computing’ here on Thursday.
Chief of the Air Staff Air Chief Marshal Mujahid Anwar Khan was the chief guest on the occasion and formally inaugurated the centre. Speaking at the ceremony, Mujahid Anwar Khan said establishment of the centre was a landmark initiative in the evolutionary journey of PAF, which would lead to artificial intelligence research and development in both civil and military spheres.
He said technology had altered the characteristics of warfare in the 21st century and the vision of establishing the centre was to harness the potential of artificial intelligence and its integration in PAF’s operational domain.
Earlier, Air Marshal Aamir Masood, the deputy chief of the air staff (training), gave an overview of the new centre.

The event was attended by former air chiefs, principal staff officers and senior serving and retired PAF officers.
Published in Dawn, August 28th, 2020

 
A good decision! I am pretty sure the Turkish "folks" below will get ample input from their Pak counterparts to hone their "intelligence" vis-à-vis air warfare...
[attached images]
 

How does AI transform aerial combat?
In times when aerial combat was a major deciding factor in war, a fighter pilot with five or more victories in close-range combat (also called dogfighting) was considered an ace. Now DARPA, the research and development arm of the U.S. military, wants AI to accumulate victory ratios of 50-to-1 or more through its Aerial Combat Evolution (ACE) program. DARPA and several contractors including Dynetics, a subsidiary of Leidos, hope the multi-phased ACE program will demonstrate AI can outperform human pilots reliably starting with this month’s AlphaDogfight Trials, a tournament-style contest anticipated as one of the next great AI competitions between man and machine.

Kevin Albarado, a senior engineer at Dynetics and the Chief Engineer on the program, said dogfighting is surprisingly well-suited for algorithms because it’s a bounded task with defined goals and measurable outcomes. He believes the advanced AI used on ACE could create a significant imbalance in favor of the U.S. because of its ability to process huge amounts of data from the various sensors on modern military aircraft in order to decide and act on split-second decisions.

This sort of autonomy is the technology backbone that will enable the Defense department’s vision of Mosaic Warfare, which relies on many diverse pieces to confuse and overwhelm the adversary. In this type of combat human pilots become cockpit-based commanders, orchestrating the larger battle while trusting AI to pilot their own plane. By networking unmanned platforms together as a system of systems, assets that are far more expendable than human pilots, these commanders can amass forces more easily and affordably to present greater complications to the enemy. DARPA compares this new role to a football coach calling plays based on the players on the field, a vision that requires trust in their various skills. To learn more we welcome Albarado.

Q: Why train AI to dogfight when conventional wisdom says these types of battles are obsolete? What makes this use case so relevant?

Albarado:
Dogfighting represents the pinnacle of air combat. Even if we don’t expect it to be heavily utilized in the future, it still represents the most difficult and stressing combat environment a pilot can face. It requires a high cognitive workload. It’s a highly dynamic scenario. The situation can turn in an instant. If you can competently dogfight against increasingly competent and complex adversaries without killing yourself or your wingmen, you can be trusted to fly any other air combat mission a commander might ask of you. In a similar vein, employing AI against increasingly skilled adversaries is primarily focused on building trust that AI can handle high-intensity combat.




Q: What is your team's primary role on the ACE program?
Albarado:
Our primary role is to develop the AI that enables pilots to apply their tactical expertise to swarms of vehicles as a commander in a battle management role. If you’re a pilot who can delegate the more mechanical aspects of flying your plane to a machine, what are you then capable of doing? Now you can oversee other aircraft and significantly multiply your effectiveness in battle. In fact, we are aiming for kill ratios as high as 30-to-1 and even 50-to-1.

As commander, you don’t decide in detail what exactly each plane should do, but rather you give them very high level objectives to autonomously carry out. Dynetics’ role on ACE is to scale the single aircraft autonomy algorithms up to handle collaboration among a large force in very complex scenarios while achieving a high kill ratio. Our job is to make a competent enough AI for the pilot to stay engaged and manage the battle while the battle is going on.

This requires the pilot’s trust for the AI to expertly handle piloting, thereby freeing up the cognitive workload of the pilot to focus on dynamic battle management, where human creativity, interpretation of intent, and other legal, moral, and ethical decisions are best made by humans.
Air Combat Evolution

[Image: Dogfighting represents the pinnacle of air combat, making it an ideal use case to test our most advanced AI. (Image from DARPA)]

Q: What's the biggest challenge in building this trust?

Albarado:
The biggest builder of trust is for the warfighter to have experience with the autonomous system, to interact with it and see that it’s going to perform how it is expected to perform. Deviating from these expectations reduces trust in any system, especially if it leads to poor outcomes. So the biggest challenge is making a competent AI that’s sufficient even in the more complex scenarios, but retains a high degree of effectiveness under simpler scenarios. As AI starts to push into more complex and sophisticated domains, it’s really important to systematically build trust over progressively complex scenarios so that humans have sufficient time and evidence to gain and retain trust. Of course, we’re pushing the envelope with DARPA, but doing it in such a way as to be a convincing pathfinder for future AI efforts.

Q: How do you measure or quantify trust?
Albarado:
Our ability to keep the cockpit-based commander’s attention on the human-machine interface (HMI) depicting the larger air battle is part of how you measure how much he or she trusts the AI flying the plane. It would be akin to getting in a self-driving car and having a computer screen playing a movie. The more you’re engaged in that movie, the more you trust that car to drive itself. If instead you are looking at the road or monitoring what the car is doing, you likely don’t trust the autonomous car.

Q: How do human pilots learn to dogfight? Will AI learn in a similar way?
Albarado:
When a human pilot normally learns to dogfight you don’t just get in the cockpit of a fighter jet and start dogfighting. You build up to it, starting with takeoff and landing. Then you learn how to get into proper position against an easy, slow, straight-flying target. Once you master that, you’ve built enough trust in your leadership to move the needle up a little more. Perhaps next, you perform the same mission, only this time against a target that’s maneuvering. Once you master that, you move up and learn how to fly with your wingman. You scale from there with 2 vs. 2 battles. Then you progress to 4 vs. 4 and so on. As a human, if you can prove time and again you can handle those scenarios, your commanding officer is going to trust you and your wingman to get the job done in a wider range of scenarios. The overarching hypothesis of the ACE program is that we can build trust with AI in the same manner. So it’s AI that is now in the cockpit as the rookie pilot, having to prove its trustworthiness to the skilled human pilots in the actual seat.
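
(A toy sketch of that "build up to it" idea, in case it helps: an agent only advances to the next, harder scenario once it clears a trust threshold at the current one. The scenario list, threshold and the crude "skill" scalar are all invented for illustration; real staged training for ACE would obviously be far more involved.)

```python
# Toy illustration of staged ("curriculum") training with a trust threshold per stage.
import random

CURRICULUM = ["non-maneuvering target", "maneuvering target", "1 v 1", "2 v 2", "4 v 4"]
TRUST_THRESHOLD = 0.8   # assumed win rate required before advancing
EPISODES_PER_STAGE = 200

def run_engagement(skill: float, difficulty: int) -> bool:
    """Stand-in for a simulated engagement; harder stages need more skill to win."""
    win_prob = min(1.0, skill / (1.0 + 0.1 * difficulty))
    return random.random() < win_prob

def train_with_curriculum() -> float:
    skill = 0.3  # crude scalar stand-in for the quality of the agent's learned policy
    for difficulty, stage in enumerate(CURRICULUM):
        while True:
            wins = sum(run_engagement(skill, difficulty) for _ in range(EPISODES_PER_STAGE))
            win_rate = wins / EPISODES_PER_STAGE
            if win_rate >= TRUST_THRESHOLD:
                print(f"{stage}: win rate {win_rate:.2f} -> advance to next stage")
                break
            skill += 0.05  # "learning": improve a little, then re-evaluate at this stage
    return skill

if __name__ == "__main__":
    random.seed(0)
    train_with_curriculum()
```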

Q: What’s the primary AI technique you’re using in the Dynetics phase of the ACE program?
Albarado:
We’re using a lot of reinforcement learning, which is not unlike trying to get your kids to achieve small objectives you know will lead to some bigger objective you’re trying to get at, like learning to put a few toys back in the bin, as a part of eventually being able to clean an entire room on their own, or at least have the skill to clean their rooms, regardless of how many times they have to be told to do so. At its core, reinforcement learning shares a foundation with traditional machine learning techniques. For a typical machine learning problem, you have a big data set with inputs and desired outputs, and your goal is to develop a model to predict the outputs given the inputs. Reinforcement learning is the same problem, only we don’t know the data yet. Through trial and error, we develop policy models that dictate actions (the output) given a current state (the input) to maximize some reward.
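
(To make the "policy models that dictate actions given a current state to maximize some reward" idea concrete, here is a generic tabular Q-learning toy on a one-dimensional chase task. It is not from the ACE program; the states, actions and rewards are all invented.)

```python
# Generic reinforcement-learning toy (tabular Q-learning): learn, by trial and error,
# a policy mapping state -> action that maximises reward. Everything here is invented.
import random

N_STATES = 10            # positions on a line; the "target" sits at the last state
ACTIONS = [-1, +1]       # move left or right
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q[state][action_index]

def step(state, action_idx):
    """Apply an action; return (next_state, reward, done)."""
    nxt = max(0, min(N_STATES - 1, state + ACTIONS[action_idx]))
    if nxt == N_STATES - 1:
        return nxt, 1.0, True     # reached the target
    return nxt, -0.01, False      # small cost per step, so shorter paths score higher

random.seed(0)
for _ in range(500):                          # episodes of trial and error
    state, done = 0, False
    while not done:
        if random.random() < EPS:             # explore occasionally...
            a = random.randrange(len(ACTIONS))
        else:                                 # ...otherwise exploit the current policy
            a = Q[state].index(max(Q[state]))
        nxt, reward, done = step(state, a)
        # Q-learning update: move the value of (state, action) towards reward + discounted future value
        Q[state][a] += ALPHA * (reward + GAMMA * max(Q[nxt]) - Q[state][a])
        state = nxt

# The learned policy: the best action in each state (the "output" for each "input")
print([ACTIONS[row.index(max(row))] for row in Q])
```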

Brandon Buckner, Senior Editorial Manager. Posted August 5, 2020.
 
The highlighted parts only.

The reason the AI pilot was able to defeat the human is that the USAF has tonnes and tonnes of
data from previous flights.
They digitize each and every thing, over thousands and thousands of flight hours, in all different scenarios.
That is what the AI algorithm uses to train itself.
During the training phase, its learning is checked and all negative inferences are removed.

Now, do you think PAF is ready to make active use of AI?

The second highlighted comment you made... look at the graduates coming out now:
the lack of mathematical and analytical skills, our outdated curriculum and our teaching methods all
tell us that there will NOT be any miracle.


See the comments on this thread, from people unwilling to accept the truth or respect the facts at hand.

Can you quote any example where any of Pakistan's armed services, especially PAF, opened a facility without considering its resources and outcome?

If AFINITI Pair Better can work on AI algorithms for the most complex human behavioral patterns in a small office in Lahore, launching a successful product and selling it to various foreign firms, then why would PAF not be ready to incorporate AI in its systems?

You are generalizing the quality of output from our education system. There are many young men and women with better mathematical and analytical skills than the corrupt idiots sitting in the corridors of power.

Young kids, not even graduates yet, create their own AI algos while playing video games...

Have you ever experienced working with young aviation engineers in the US? They are no better than our Pakistani graduates, but they are getting opportunities while our young ones are unfortunately being denied them.
 
The highlighted parts only.

The reason the AI pilot was able to defeat the human is that the USAF has tonnes and tonnes of
data from previous flights.
They digitize each and every thing, over thousands and thousands of flight hours, in all different scenarios.
That is what the AI algorithm uses to train itself.
During the training phase, its learning is checked and all negative inferences are removed.

Now, do you think PAF is ready to make active use of AI?

The second highlighted comment you made... look at the graduates coming out now:
the lack of mathematical and analytical skills, our outdated curriculum and our teaching methods all
tell us that there will NOT be any miracle.

See the comments on this thread, from people unwilling to accept the truth or respect the facts at hand.
I agree things do not look good. I do know of a few kids who have made it big in computing. The problem of employment in an appropriate industrial setup with a comprehensive pay cheque is important. One of the chaps who moved back from the UK with a PhD in encryption set up the whole encryption for banking as well as NADRA in Pakistan. He also had some involvement in defence-related programmes in Isloo but has now come back to the UK. The point you have made about the curriculum as well as training is understandable, but we do have to start from somewhere. The involvement of local youth will make or break the venture. A maths degree per se is not going to help, but the will to apply learnt knowledge to experience is so important.
My own son is finishing his degree in maths. However, the most important training he got was the use of various languages to gather data efficiently during an industrial experience year at a couple of companies. However, I dare say he will be of no use to the defence-related industry as the applications are totally different. So this institute will only work if appropriate people are brought in to train local graduates in various fields and then allowed to run wild to see what they can produce. The military setup is not an efficient one for garnering that sort of talent.
 
For successful AI, investment in hardware for maintaining our own data centers is necessary; otherwise, if we use foreign hardware, it will compromise our AI capability.
 
Instead of partnering with academia and allies such as Turkey and China, why’s Pakistan again taking the most treacherous path? @Bilal Khan (Quwa)
 
Instead of partnering with academia and allies such as Turkey and China, why’s Pakistan again taking the most treacherous path? @Bilal Khan (Quwa)
hmm...I think we would've needed our own entity regardless. Even to collaborate with Turkey and China, we still have to bring something to the table in terms of expertise, projects, etc. CENTAIC will help towards that, and we can also scale the infrastructure out to create more Pakistani entities (ideally privately owned) for AI work in other areas.
 
It's Already Too Late - Elon Musk

"Your phone is already extension of yourself. You are already a cyborg!"
 
Perhaps you forget exceptions like Pasban IT, or the many kids who became Microsoft Certified Professionals; there are people who are very bright in this field, and there is some appreciation in educated circles. We need ambitious young people, and we need to pay them well and give them incentives. Someone who has already spent a good part of their life might not get the kind of ambition I am talking about. The world is changing, sir; it's time we change too. I see the same anguish and hopelessness in you that I see in my father. Pakistan is way behind its time. The bureaucrats and politicians will die; it's my job not to let their children rule us again. So wish me luck 🤞.
Well put,
 
With two decades of professional experience (a few with PAF and PA): if I told you that this will remain only a centre,
if I told you that we do not have the mathematical background,
if I told you we do not have the data sources,
nor the culture of, or even an appreciation for, AI,

would you believe me?
Just one thing I'd like to add without taking away from the accurate challenges you've listed:
PAF has thousands of hours of flight data from the JF-17. When I say flight data, I mean that the JF-17 logs every damn thing digitally and PAF has all of this data. This is unprecedented for PAF. Furthermore, this data is just lying around doing nothing. PAC has approached some folks I know to develop flight models for the JF-17 using this data (how feasible that is, is another debate). I suspect one of the reasons for setting up this centre may be to finally utilize all of the data we have (the JF-17 has been flying since 2007, but I'm not sure how far back the logs go). A low-hanging fruit is predictive maintenance.

I may be wrong, but I haven't seen an ACMI pod on the JF-17, and the reason might well be that it doesn't need one since it logs all of that data automatically. However, this is just speculation on my part.
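
(Purely as an illustration of that "low-hanging fruit": predictive maintenance can start as simple anomaly detection over per-sortie log summaries. The features and numbers below are invented, not actual JF-17 telemetry.)

```python
# Illustrative sketch only: flag sorties whose logged engine parameters look anomalous,
# so maintainers can inspect them first. All data here is made up.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)

# Hypothetical per-sortie summaries: [max_EGT_degC, mean_vibration_ips, min_oil_pressure_psi]
normal_sorties = rng.normal(loc=[650.0, 0.4, 45.0], scale=[15.0, 0.05, 3.0], size=(2000, 3))
suspect_sorties = rng.normal(loc=[720.0, 0.9, 30.0], scale=[15.0, 0.05, 3.0], size=(5, 3))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_sorties)

# -1 marks a sortie worth pulling forward for inspection, +1 looks normal
print("suspect sortie flags:", detector.predict(suspect_sorties))
```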
 
