A method for fast estimation of Lipschitz constants for feedforward neural networks

Fast, scalable algorithms estimate neural-network Lipschitz constants for efficient robustness certification.
Technology No. 2024-XU-70766

Quantifying the robustness of neural networks is fundamental to measuring advances in machine learning capabilities. In practice, these networks receive input data and are trained progressively to refine their accuracy over time. A network's robustness describes its ability to maintain consistent output quality in the face of varied inputs, potential errors, and other changes. The Lipschitz constant is a quantitative measure of this robustness: it bounds how much a network's outputs can change under any perturbation of its inputs. This information is essential for understanding and ensuring the integrity of a neural network. However, certifying network robustness by computing the exact Lipschitz constant is a challenging problem whose computational demands are currently impractical for deep networks. Researchers at Purdue University have therefore developed a scalable approach for estimating Lipschitz constants of deep neural networks while reducing overall computation time. These methods open the possibility of certifying neural network robustness efficiently.
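As an illustration of the quantity being estimated, the sketch below computes a simple, well-known upper bound on the Lipschitz constant of a feedforward network with 1-Lipschitz activations (such as ReLU): the product of the spectral norms of its weight matrices. The network sizes and weights here are hypothetical placeholders, and this is not the compositional estimation method described in the publication, which is designed to produce tighter estimates at lower computational cost.

    import numpy as np

    def naive_lipschitz_upper_bound(weights):
        # Product of layer spectral norms; a valid (but typically loose)
        # upper bound when all activations are 1-Lipschitz (e.g., ReLU).
        bound = 1.0
        for W in weights:
            bound *= np.linalg.norm(W, 2)  # largest singular value of W
        return bound

    # Hypothetical 3-layer network: 32 inputs, two hidden layers of 64, 10 outputs.
    rng = np.random.default_rng(0)
    weights = [
        rng.standard_normal((64, 32)),
        rng.standard_normal((64, 64)),
        rng.standard_normal((10, 64)),
    ]
    print(naive_lipschitz_upper_bound(weights))

Bounds of this kind are cheap to compute but can be very conservative; the motivation for estimation methods like the one referenced below is to obtain much tighter certificates without the cost of exact computation.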

Technology Validation:

The researchers developed two algorithms and evaluated them on both randomly generated and trained neural networks. Both estimation algorithms produced Lipschitz constants comparable to those obtained with traditional methods, with reduced computation time.

Advantages:

- Reduction in computation time while maintaining accuracy

- Simplified methodology that advances the scalability and efficiency of the estimation process

Applications:

- Evaluating machine learning capabilities and training methods

- Artificial intelligence development and integration

Publication:

Xu, Y., & Sivaranjani, S. (2024). Compositional Estimation of Lipschitz Constants for Deep Neural Networks (Version 1). arXiv. https://doi.org/10.48550/ARXIV.2404.04375

TRL: 4

Intellectual Property:

Provisional-Gov. Funding, 2024-06-05, United States

Utility-Gov. Funding, 2025-05-30, United States

Keywords: Artificial Intelligence, Computer Technology, Lipschitz constant, Machine Learning, Neural Networks

  • Supporting documents (1): Product brochure, "A method for fast estimation of Lipschitz constants for feedforward neural networks.pdf"