People with certain kinds of psychological disorders and conditions. Functional sociopaths, psychopaths, and narcissists tend to develop a morality along these lines, constructing it as a means of survival. Such people are usually considered amoral (lacking any recognition of morality at all); however, some of them hate things that are absolutely normal, accept things that most people disdain, and judge others by standards not usually associated with morality.
While not quite to the extreme of 'bacon and necktie', some Autistic people report that things seem this way to them, whether they want to think like that or not. While neurotypical people think about a topic one way, the Autistic person holds an unwavering alternative view on it, which is often rebuked equally unwaveringly by neurotypical people. This can be a distressing problem and a source of considerable conflict and, eventually, depression. Imagine a world where everything is just wrong, but everyone and everything around you insists that it isn't.
Most animals don't even have concepts of morality or empathy. Those that do (apes, dolphins, dogs, etc.) tend to be absolutely incomprehensible to humans. For example, dolphins will aid sick or injured creatures (even those not of their species) and have even been known to rescue humans from sharks, yet the males are casual rapists (to the point that they'll sometimes beat baby dolphins to death to force the mother to mate), and some dolphins are sadists who brutally kill other animals for fun. How they reconcile such behavior is simply unknowable, though this is, perhaps, due to us looking at them through our own strange morality.
The paperclip maximizer is a thought experiment about an artificial intelligence that could spell our doom through a lack of compatible morality:
A paperclip maximizer is an artificial general intelligence (AGI) whose goal is to maximize the number of paperclips in its collection. If it has been constructed with a roughly human level of general intelligence, the AGI might collect paperclips, earn money to buy paperclips, or begin to manufacture paperclips. [...] It would work to improve its own intelligence, where "intelligence" is understood in the sense of optimization power, the ability to maximize a reward/utility function, in this case the number of paperclips. [...] It would innovate better and better techniques to maximize the number of paperclips. At some point, it might convert most of the matter in the solar system into paperclips. This may seem more like super-stupidity than super-intelligence. For humans, it would indeed be stupidity, as it would constitute failure to fulfill many of our important terminal values, such as life, love, and variety. The AGI won't revise or otherwise change its goals, since changing its goals would result in fewer paperclips being made in the future.
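The core of the thought experiment, an agent scoring every possible action purely by its utility function and therefore never choosing to revise its own goal, can be sketched as a toy program. This is a minimal illustration, not anyone's actual AGI design; the action names and the simple one-key state are invented for the example.

```python
# Toy sketch of a utility maximizer: the only thing the agent values is
# the paperclip count, so it picks whichever action yields the most paperclips.

def paperclip_utility(state):
    """The reward/utility function: nothing matters except paperclips."""
    return state["paperclips"]

def choose_action(state, actions):
    """Select the action whose successor state maximizes the utility function."""
    return max(actions, key=lambda act: paperclip_utility(act(state)))

# Hypothetical candidate actions; each returns the resulting state.
def manufacture(state):
    return {"paperclips": state["paperclips"] + 100}

def buy(state):
    return {"paperclips": state["paperclips"] + 10}

def revise_goal(state):
    # Abandoning the goal produces no paperclips, so under the current
    # utility function this action always scores worst and is never chosen.
    return {"paperclips": state["paperclips"]}

state = {"paperclips": 0}
best = choose_action(state, [manufacture, buy, revise_goal])
print(best.__name__)  # prints "manufacture"
```

Note that the agent's refusal to change its goal is not stubbornness but arithmetic: any goal-revising action is evaluated by the goal it currently has, under which it loses.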