Games aren't released in a creative vacuum. Other games have gone before, and a designer can shoot himself in the foot by ignoring the ideas those other games have already set in the player's mind.
For example, I was playing Mirror's Edge yesterday. Occasionally, a few birds will be resting on the edge of a rooftop, and they fly away if the player comes near. It's apparently just environmental detail... and that's the problem. Five years ago, those birds would have been fine. But Assassin's Creed changed that.
The gameplay of Assassin's Creed also involves rooftops and acrobatics. It also includes birds resting on edges... and, in that game, those birds signal a spot from which the player may jump and expect to land safely in a bale of hay. So, when a gamer plays Assassin's Creed before Mirror's Edge, that training becomes problematic. It's not a great flaw, certainly. But it demonstrates how a gamer's past experiences affect present gameplay.
Perhaps a better example is shooter controls. On the PC, you're a fool to abandon the traditional WASD movement controls, because that configuration has become instinctual for the majority of gamers. Unfortunately, there is less of a tradition with console shooters. Right-trigger is universally accepted as the command for firing a gun, but other standard actions (zooming, grenades, sprinting, ducking, etc.) vary from game to game. The result is that it's uncomfortable for a gamer to move from one console shooter to another.
Anyway, I could provide many examples, and they would all be debatable. The point is that gamers' past experiences matter. That a design decision makes the most sense on its own is not a good enough reason to include it. It must also be asked whether that feature conflicts with players' expectations enough to become a distraction or burden. Like I said, games aren't released in a vacuum.