# Imaging with quadratic data efficiently

In this talk I will discuss the problem of coherent imaging with quadratic data. I will explain how to formulate this problem as a non-convex optimization problem: minimize the rank of a positive semidefinite decision matrix subject to the data constraints, knowing that the true solution has rank one. Since rank minimization is NP-hard, one can replace the rank by the nuclear norm of the decision matrix, which makes the problem convex and solvable in polynomial time. This relaxation guarantees exact recovery and works very well when the size of the problem is small. The bottleneck for inversion, however, is the number of unknowns, which grows quadratically with the size of the problem, resulting in a prohibitive computational cost as the dimension increases.

Hence, I will discuss another method for imaging with quadratic data whose computational cost grows linearly with the size of the problem. The keystone is the use of a Noise Collector that absorbs the data component coming from the off-diagonal elements of the decision matrix. This allows us to reduce the dimensionality of the problem, making it competitive with the usual problem of imaging with complete data that include phases. This method provides exact support recovery when the images are sparse and the data are not too noisy. Moreover, it guarantees no false positives for any level of noise.
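The lifting described above can be written in a standard form. The symbols below (an unknown reflectivity vector ρ, measurement vectors a_j, and quadratic data b_j) are my own notation, since the abstract does not fix any:

```latex
% Lift the unknown vector \rho to X = \rho\rho^{*}, so the quadratic data
% become linear in the decision matrix X:
b_j \;=\; |\langle a_j, \rho\rangle|^{2} \;=\; a_j^{*} X a_j,
\qquad X \succeq 0, \quad \operatorname{rank}(X) = 1.

% Non-convex rank-minimization problem and its convex nuclear-norm relaxation:
\min_{X \succeq 0} \ \operatorname{rank}(X)
\ \ \text{s.t.}\ \ a_j^{*} X a_j = b_j
\qquad\longrightarrow\qquad
\min_{X \succeq 0} \ \|X\|_{*}
\ \ \text{s.t.}\ \ a_j^{*} X a_j = b_j.
```

Note that for a positive semidefinite matrix the nuclear norm equals the trace, so the relaxed problem is a semidefinite program. The number of unknowns is the number of entries of X, which is quadratic in the length of ρ.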
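The abstract does not detail how the Noise Collector is constructed. In the sparse-recovery literature, a noise collector typically augments the imaging matrix with an auxiliary random block whose columns absorb the part of the data the imaging matrix should not explain; a schematic form, under that assumption, is:

```latex
% Schematic noise-collector formulation (an assumed form; the talk's exact
% construction may differ). A is the imaging matrix acting on the retained
% unknowns \rho, C is an auxiliary random "collector" matrix, and \eta are
% auxiliary variables that absorb the unwanted data component.
\min_{\rho,\ \eta} \ \tau \,\|\rho\|_{1} + \|\eta\|_{1}
\quad \text{s.t.} \quad A\rho + C\eta = b.
```

Because only the unknowns ρ are of interest and the auxiliary variables are discarded, the number of retained unknowns grows linearly, rather than quadratically, with the size of the problem, consistent with the cost claimed in the abstract.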
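A way to see why the nuclear norm promotes low rank is through its proximal operator, singular value thresholding, which shrinks small singular values to zero. Below is a minimal numpy sketch; the 8-point image, the noise level, and the threshold are illustrative choices of mine, not from the talk:

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)  # shrink singular values toward zero
    return (U * s_thr) @ Vt

# Rank-one ground truth X = rho rho^T built from a sparse "image" rho.
rho = np.zeros(8)
rho[2] = 1.0
X_true = np.outer(rho, rho)

# Perturb X with small noise: thresholding removes the noise singular values
# and keeps only the dominant rank-one component.
rng = np.random.default_rng(0)
X_noisy = X_true + 0.01 * rng.standard_normal((8, 8))
X_hat = svt(X_noisy, tau=0.1)

print(np.linalg.matrix_rank(X_hat))  # the recovered matrix is rank one
```

In iterative solvers for the nuclear-norm problem, a step of this form alternates with a step enforcing the data constraints; the sketch only isolates the thresholding step.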