It is known that in various random matrix models, large perturbations create outlier eigenvalues which lie, asymptotically, in the complement of the support of the limiting spectral density. This thesis studies the fluctuations of these outlier eigenvalues for i.i.d. matrices $X_n$ under bounded-rank and bounded-operator-norm perturbations $A_n$, namely the fluctuations $\lambda(\frac{X_n}{\sqrt{n}}+A_n)-\lambda(A_n)$. The perturbations $A_n$ we consider belong to a large class: we allow arbitrary Jordan types and impose almost minimal assumptions on the left and right eigenvectors of $A_n$. We obtain the joint convergence of the normalized fluctuations of the outlier eigenvalues in this setting via a unified approach.
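For orientation, the phenomenon described above can be illustrated numerically. The following is a minimal sketch, not part of the thesis: it takes a rank-one perturbation $A_n$ with a single eigenvalue $\theta = 2$ outside the unit disk (all specific values here are illustrative choices) and checks that $\frac{X_n}{\sqrt{n}} + A_n$ exhibits one eigenvalue near $\theta$ while the bulk remains near the unit disk.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
theta = 2.0  # eigenvalue of the rank-one perturbation; chosen > 1 so an outlier appears

# iid matrix with mean-zero, unit-variance entries
X = rng.standard_normal((n, n))

# bounded-rank, bounded-operator-norm perturbation: rank one, norm theta
A = np.zeros((n, n))
A[0, 0] = theta

eigs = np.linalg.eigvals(X / np.sqrt(n) + A)

# the bulk lies near the unit disk (circular law), so the eigenvalue of
# largest modulus is the outlier, close to theta
outlier = eigs[np.argmax(np.abs(eigs))]
print(abs(outlier - theta))  # small: fluctuations shrink as n grows
```

The fluctuation $\lambda(\frac{X_n}{\sqrt{n}}+A_n)-\lambda(A_n)$ printed above is of a small order in $n$; the thesis characterizes the limiting distribution of such fluctuations, suitably normalized, for a much more general class of perturbations.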