Posts by Tags

Adversarial Attack

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Adversarial Training

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Artificial Intelligence

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

Compute

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

Data Centres

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

Deep Learning

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

Energy

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

GPU

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

Machine Learning

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

Machine Learning Safety

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Neural Networks

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

Paper Summary

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

Python

Research

Machine Learning Safety

6 minute read

Published:

After taking an Advanced Artificial Intelligence course, I realised that while modern AI models achieve very high benchmark performance, they have a massive, invisible blind spot. We trust these so-called “Black Box” systems with increasingly critical tasks, from self-driving vehicles to autonomous robotics and medical diagnosis, assuming that if they have high accuracy, they are also “intelligent”.

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

Space-based AI Data Centres

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

Space Technology

Space-based AI Data Centres

6 minute read

Published:

After taking a deep dive into the computational infrastructure demands of modern Artificial Intelligence, I realised that while our models are scaling exponentially, the physical infrastructure required to run them is hitting a massive bottleneck. We trust these so-called “Hyperscale” data centres to process everything from everyday user LLM queries to autonomous vehicle networks, assuming that our current power-grid infrastructure can endlessly accommodate them out of the box.

Transformers

Is Attention Really All You Need?

6 minute read

Published:

In 2017, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin published the research paper “Attention Is All You Need”, which introduced the Transformer architecture. But is attention really all you need? In this post, I’ll summarise the paper’s key ideas and share my takeaways on that question.

XOR