Maximum likelihood estimation (MLE) is influential because it can be applied almost mechanically to generate optimal, statistically efficient procedures for broad classes of estimation problems. The classical theory, however, does not extend to modern settings in which estimators face resource constraints, such as limits on computation, communication, or privacy. This thesis will introduce a modern maximum likelihood theory, built on a generalization of local minimax theory, that addresses these issues, focusing specifically on procedures that must be computationally efficient or privacy-preserving. To do so, I first derive analogues of Fisher information for these settings, which allow a precise characterization of the tradeoffs between statistical efficiency, privacy, and computation. To complete the development, I also describe a recipe that generates optimal statistical procedures (analogues of the MLE) in the new settings, showing how to achieve the new Fisher information lower bounds.
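For context, the classical benchmark that such Fisher information lower bounds generalize can be stated concretely. The following is standard background (the Hájek–Le Cam local asymptotic minimax bound, written here for a one-dimensional parameter under squared error), not a result specific to this thesis:

```latex
% Local asymptotic minimax bound (scalar case, squared error):
% for a suitably regular model $\{P_\theta\}$ with Fisher information
% $I(\theta)$, every estimator sequence $\hat{\theta}_n$ satisfies
\lim_{c \to \infty} \, \liminf_{n \to \infty} \,
  \sup_{|\theta' - \theta| \le c/\sqrt{n}}
  \; n \, \mathbb{E}_{\theta'}\!\left[ \big(\hat{\theta}_n - \theta'\big)^2 \right]
  \;\ge\; \frac{1}{I(\theta)},
% a bound the MLE attains under classical regularity conditions.
```

The resource-constrained analogues described above replace $I(\theta)$ with information quantities that account for the privacy or computational budget, yielding sharper (larger) lower bounds in those settings.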