COMET-Modul Security and Safety for Shared Artificial Intelligence (S3AI)

Term: 1/2020 - 12/2023 (36 months)

Partner: SCCH, Hagenberg

Currently, artificial intelligence (AI) is undergoing a paradigm shift toward sharing deep machine learning artifacts in order to build powerful collaborative AI ecosystems. This development marks an innovative move from sharing data to sharing the hidden distributed representations inside deep models. It affects the scope of AI applications and related business models in versatile ways, notably by reducing development costs through the reuse of pretrained models and by saving data acquisition effort. The even more far-reaching effect, however, comes from unlocking the value trapped in data across company borders: it overcomes limitations in the availability of annotated data for high-quality customized services and opens up new forms of collaborative AI-based business models between different parties.

At the same time, this emerging technology imposes new challenges, above all in terms of safety and security. The overall goal of S3AI is therefore to provide the foundations required to build secure and safe shared artificial intelligence systems. As key scientific-technological challenges, S3AI focuses on methods for privacy preservation, protection against adversarial attacks, and guarantees for a system's intended behavior. The research within S3AI tackles these challenges with a model-intrinsic security-and-safety-by-design approach, following the principle that security must be thought through from start to finish. We will build on novel model architectures based on distributed deep transfer learning, exploiting mathematical concepts from computational algebraic geometry and regularization.
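To make the core idea concrete, the following minimal sketch illustrates sharing a representation instead of raw data. It is an illustrative toy only, not the S3AI architecture: the "pretrained encoder" is a fixed random projection standing in for a deep model's hidden layer, and all names (`W_shared`, `encode`, `w_head`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Party A shares a "pretrained" feature extractor (here: a fixed
# random projection standing in for a deep encoder). Only these
# weights are shared -- never Party A's raw training data.
W_shared = rng.normal(size=(4, 8))  # hypothetical encoder weights

def encode(x):
    """Map raw inputs into the shared hidden representation."""
    return np.tanh(x @ W_shared)

# Party B holds its own small labelled data set and trains only a
# lightweight logistic-regression head on top of the frozen
# shared representation (transfer learning by feature reuse).
X_b = rng.normal(size=(64, 4))
y_b = (X_b[:, 0] + X_b[:, 1] > 0).astype(float)

H = encode(X_b)        # frozen features; the encoder never changes
w_head = np.zeros(8)   # parameters of Party B's head

def log_loss(w):
    p = 1.0 / (1.0 + np.exp(-(H @ w)))
    return -np.mean(y_b * np.log(p + 1e-9) + (1 - y_b) * np.log(1 - p + 1e-9))

# Plain full-batch gradient descent on the head only.
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(H @ w_head)))
    w_head -= 0.5 * H.T @ (p - y_b) / len(y_b)
```

The point of the sketch is the division of labour: the shared encoder carries the transferable knowledge, while each party trains only a small task-specific head on its own data. The security and safety questions S3AI addresses arise precisely because such shared weights can still leak information about, or be manipulated to attack, the systems built on them.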