I've mentioned before that I often find myself noticing fundamental bugs in the way the human brain works and wishing my brain were better designed.
Here's another one: my brain is often very bad at predicting how it would behave in dangerous or scary situations. It's annoyingly common for me to evaluate several courses of action in advance of an event, decide which one I like best, and then, when the time comes to actually commit myself, discover that the one I'd chosen is terribly scary now that it's physically staring me in the face rather than being considered as an abstract strategic puzzle.
If I were designing an ideal intelligence, I would give it a properly working imagination. It would be able to set up a hypothetical situation, put itself into that situation, and reason exactly as if it were real. It would either be able to suppress, temporarily but completely, the knowledge that the situation wasn't real, or else it would be able to reliably prevent that knowledge from impinging on its reasoning processes. In fact, now that I've written that either/or, I'm not entirely sure I can robustly define the difference between those two possibilities; but either way, the fundamental architecture of my intelligence would be designed so that if it decided it would react a certain way in a scary situation, you could depend on it being right.