AI Strategy and Concepts Bibliography

WIKINDX Resources

Richins, D., Doshi, D., Blackmore, M., Nair, A. T., Pathapati, N., Patel, A., et al. (2020). Missing the forest for the trees: End-to-end AI application performance in edge data centers. Paper presented at the 2020 IEEE International Symposium on High Performance Computer Architecture (HPCA).
Resource type: Proceedings Article
BibTeX citation key: Richins2020
Categories: Artificial Intelligence, Computer Science, Data Sciences, Engineering, General, Innovation, Military Science
Subcategories: 5G, Big data, Cloud computing, Command and control, Cyber, Deep learning, Edge AI, Internet of things, JADC2, Machine learning, Mosaic warfare, Networked forces, Neural nets
Creators: Blackmore, Daguman, Dobrijalowski, Doshi, Illikkal, Long, Nair, others, Patel, Pathapati, Richins
Collection: 2020 IEEE International Symposium on High Performance Computer Architecture (HPCA)
Abstract
Artificial intelligence and machine learning are experiencing widespread adoption in industry, academia, and even public consciousness. This adoption has been driven by rapid advances in the applications and accuracy of AI through increasingly complex algorithms and models, which in turn have spurred research into specialized hardware AI accelerators. The rapid pace of these advances makes it easy to miss the forest for the trees: accelerators are often developed and evaluated in a vacuum, without considering the full application environment in which they must eventually operate. In this paper, we deploy and characterize Face Recognition, an AI-centric edge video analytics application built using open-source and widely adopted infrastructure and ML tools. We evaluate its holistic, end-to-end behavior in a production-size edge data center and reveal the "AI tax" for all the processing that is involved. Even though the application is built around state-of-the-art AI and ML algorithms, it relies heavily on pre- and post-processing code that must be executed on a general-purpose CPU. As AI-centric applications start to reap the acceleration promised by so many accelerators, we find they impose stresses on the underlying software infrastructure and the data center's capabilities: storage and network bandwidth become major bottlenecks with increasing AI acceleration. By not having to serve a wide variety of applications, a purpose-built edge data center can be designed to accommodate the stresses of accelerated AI at 15% lower TCO than one derived from homogeneous servers and infrastructure. We also discuss how our conclusions generalize beyond Face Recognition, as many AI-centric applications at the edge rely on the same underlying software and hardware infrastructure.