The separation performance, energy demand, and operating costs of electrocoagulation (EC) were compared with conventional chemical coagulation for oil–water separation using a simulated oil- and gas-produced water matrix. An iron-based chemical coagulant and sacrificial iron electrodes were evaluated. Effluent turbidity, chemical oxygen demand (COD), total organic carbon (TOC), and oil and grease (O&G) removal were determined over a range of coagulant doses, reaction times, and current densities. When normalized by the total iron dose, chemical coagulation produced superior turbidity removal. At lower iron doses (<500 mg/L), chemical coagulation yielded better COD, turbidity, and O&G removal; however, it was unable to remove contaminants sufficiently to meet the offshore discharge limit of 29 ppm O&G. At higher iron doses, EC was more effective at removing COD and O&G. The energy consumption of EC was much higher, even when the energy required to produce, transport, and mix the chemical coagulant was included; nevertheless, the overall cost of EC was approximately half that of chemical coagulation, and EC was more effective at O&G removal.
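For context on how current density and reaction time map to an equivalent iron dose in EC, the iron released from sacrificial electrodes is commonly estimated with Faraday's law; this is a standard relationship, not a result stated in the abstract, and the numerical example below assumes dissolution as Fe²⁺ and 100% current efficiency:

\[
m_{\mathrm{Fe}} = \frac{I \, t \, M_{\mathrm{Fe}}}{z F}
\]

where \(I\) is the applied current (A), \(t\) the electrolysis time (s), \(M_{\mathrm{Fe}} = 55.85\ \mathrm{g/mol}\), \(z = 2\) (Fe²⁺), and \(F = 96485\ \mathrm{C/mol}\). For example, 2 A applied for 600 s releases \(m_{\mathrm{Fe}} = (2 \times 600 \times 55.85)/(2 \times 96485) \approx 0.35\ \mathrm{g}\) of iron, which in 1 L of treated water corresponds to roughly 350 mg/L. Dividing by the treated volume allows the EC iron dose to be compared directly with the chemical coagulant dose, as done in the abstract.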