diff --git a/src/chapters/introduction.tex b/src/chapters/introduction.tex
index 449e6e4..254a055 100644
--- a/src/chapters/introduction.tex
+++ b/src/chapters/introduction.tex
@@ -7,7 +7,7 @@ Composed of multiple layers of interconnected nodes that mimic a network of neur
 Consequently, \acp{dnn} make it possible to tackle many new classes of problems that were previously beyond the reach of conventional algorithms.
 However, the ever-increasing use of these technologies poses new challenges on hardware architectures, as the energy required to train and run these models reaches unprecedented levels.
-Recently published numbers approximate that the development and training of Meta's LLaMA model over a period of about five months consumed around $\qty{2638}{\mega\watt\hour}$ of electrical energy and caused a total emission of $\qty{1015}{tCO_2eq}$ \cite{touvron2023}.
+Recently published numbers approximate that the development and training of Meta's LLaMA model consumed around $\qty{2638}{\mega\watt\hour}$ of electrical energy over a period of about five months and caused a total emission of $\qty{1015}{tCO_2eq}$ \cite{touvron2023}.
 As these numbers are expected to increase in the future, it is clear that the energy footprint of current deployment of \ac{ai} applications is not sustainable \cite{blott2023}.
 In a more general view, the energy demand of computing for new applications continues to grow exponentially, doubling about every two years, while the world's energy production only grows linearly, at about $\qty{2}{\percent}$ per year \cite{src2021}, which is shown in \cref{plt:enery_chart}.