Normalizing Flows (NF) are bijective maps between the data distribution and a Gaussian (normal) distribution. In contrast to other generative models they are lossless and provide the exact data likelihood via the Jacobian of the transformation. I will first present a novel Sliced Iterative Normalizing Flow (SINF), based on Optimal Transport theory, which achieves state-of-the-art results in density estimation for small data samples and in anomaly detection applications in high energy physics. I will then discuss its applications to Bayesian Inference and Global Optimization, where it enables new methods of sampling and optimization that have the potential to accelerate standard Markov Chain Monte Carlo.

In the second half of the talk I will present a Normalizing Flow for data structures with Rotational and Translational Equivariance (TRENF), which can be used for generative modeling and likelihood analysis of cosmological data. By learning the data likelihood as a function of cosmological parameters, this approach enables near-optimal cosmological likelihood analysis, in which the information from all the data is optimally combined into a single number (the likelihood) as a function of the cosmological parameters. The method provides uncertainty quantification via the full posterior of the cosmological parameters, paving the way for a complete and optimal cosmological data analysis with Normalizing Flows.
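The claim that a Normalizing Flow provides the exact data likelihood via the Jacobian of the transformation can be made concrete with the change-of-variables formula. The following is a minimal toy sketch (not SINF or TRENF themselves): a hypothetical one-dimensional affine bijection mapping data to a standard normal base, with the log-likelihood assembled from the base density and the log-Jacobian term.

```python
import numpy as np

# Toy illustration of the change-of-variables formula behind NF likelihoods:
#   log p(x) = log N(f(x); 0, 1) + log |det df/dx|
# Here f is a hypothetical 1-D affine bijection z = (x - mu) / sigma,
# chosen only because its Jacobian is trivial to write down.

def affine_flow_logp(x, mu, sigma):
    z = (x - mu) / sigma                               # forward map to the Gaussian base
    log_base = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)   # standard normal log-pdf at z
    log_det_jac = -np.log(sigma)                       # log |dz/dx| of the affine map
    return log_base + log_det_jac

x = np.array([1.5, 2.0, 2.5])
log_px = affine_flow_logp(x, mu=2.0, sigma=0.5)

# Sanity check: for this affine flow the result is exactly the
# N(mu, sigma^2) log-density evaluated at x.
expected = -0.5 * ((x - 2.0) / 0.5) ** 2 - 0.5 * np.log(2 * np.pi) - np.log(0.5)
assert np.allclose(log_px, expected)
```

Real flows compose many such bijections (learned from data), and the log-Jacobian terms simply add up along the composition, which is what makes the likelihood tractable.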