While the Detect Displays feature in OS X may appear to be gone, it is merely hidden.
Topher Kessler, MacFixIt Editor
When you attach a new display to your Mac, be it a primary or secondary display, the system should recognize the new addition and configure itself to use it. When it does this, the displays may all blink blue for a second or two, and then show your desktop on the new display, usually in extended desktop mode, though mirrored mode is also possible.
This happens because the system is determining the make, model, and capabilities of the display so it can use it properly. There are times when this may not work, though. In such cases, the system may either fail to detect the display, or detect it improperly and not offer the correct resolution options or make-and-model information.
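If you want to check what information the system actually gathered about your displays, one option (assuming you are comfortable in Terminal) is Apple's `system_profiler` command-line tool, whose `SPDisplaysDataType` section lists each connected display along with its reported model and resolution. This is just a read-only sketch for inspecting the detection results, not part of the hidden Detect Displays feature itself; the fallback branch is only there so the script exits cleanly on non-Mac systems.

```shell
#!/bin/sh
# Print what OS X has detected about the graphics hardware and
# attached displays (model, resolution, connection type).
if command -v system_profiler >/dev/null 2>&1; then
  # Apple's profiler tool; SPDisplaysDataType covers GPUs and displays.
  system_profiler SPDisplaysDataType
else
  # Not running on a Mac, so the tool is unavailable.
  echo "system_profiler not available on this system"
fi
```

If a display shows up here with the wrong model name or an odd resolution, that is a sign detection went wrong and forcing a re-detection, as described below, may help.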
If this happens, you can try disconnecting and reconnecting the display, or restarting your computer. Apple also provides a way to force the system to detect any attached monitors and configure them accordingly: in the Displays system preferences there is a Detect Displays button which, when clicked, will attempt to configure any newly attached devices.
Unfortunately, if you are using OS X 10.8, you may not find this button in the system preferences, but it is still available if you need it. To use it, simply hold the Option key with the Displays system preferences open, and the Detect Displays button should appear at the bottom of the window.
Who knows why Apple decided to hide this function, but if you need it, this is the way to find it, both in the current version of OS X and likely in upcoming versions as well.