The way that data is represented has no meaningful impact on the "philosophy" of a computer beyond the difference between digital and analog.
still gives too much to the 'computers are scary' crowd. "The difference between digital and analog" is usually taken to mean that computers/software/AI don't get nuance and only deal in black and white (e.g. Matrix-style kill-all-humans flips straight to love-all-humans).
[i]Whereas[/i] even my washing machine uses fuzzy logic, which allows for e.g. 15% cold + 45% hot = 'quite warm'.
Probably not helped by the abysmally lazy AIs kludged into most FPS games.
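The washing-machine point above can be sketched in a few lines: fuzzy inputs are degrees (15% cold, 45% hot) rather than on/off switches, and a weighted average turns them into an in-between output. This is a toy illustration with made-up temperatures and thresholds, not any real appliance's firmware:

```python
def mix_temperature(cold_degree, hot_degree, cold_temp=10.0, hot_temp=60.0):
    """Blend water temperatures by fuzzy membership degree (weighted-average
    defuzzification). Degrees need not sum to 1; they are relative weights."""
    total = cold_degree + hot_degree
    if total == 0:
        raise ValueError("at least one input must have a nonzero degree")
    return (cold_degree * cold_temp + hot_degree * hot_temp) / total

def label(temp):
    """Map a temperature back to a coarse linguistic label (arbitrary cutoffs)."""
    if temp < 20:
        return "cool"
    if temp < 35:
        return "warm"
    if temp < 50:
        return "quite warm"
    return "hot"

t = mix_temperature(0.15, 0.45)  # 15% cold + 45% hot
print(round(t, 1), label(t))     # → 47.5 quite warm
```

The point is that the output lands on a continuum between the two extremes, which is exactly the nuance the 'black-and-white' caricature claims computers can't have.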