At one job, they complained that I gave TOO MUCH INFORMATION in
  my bug reports. Too much? TOO MUCH?! How can you solve a bug if
  you're missing info? Gotta know context. Sometimes the bug
  isn't where you think it is. After all... if it was an OBVIOUS,
  clear, and easy bug... somebody probably would have spotted it
  already. It's fun to be the first, though. The best cause of
  bugs? Video memory. Bad video memory. Nobody ever looks
  there. I've seen that problem from the days of Hercules
  monochrome amber screens all the way through to
  super-deluxe-duper-3D-rendering-the-Universe-in-a-box video
  cards. One of my favorites? A laptop that didn't work right
  anymore because the SOLDER CONNECTIONS to the video chip MELTED
  from games utilizing the video memory for matrix transforms...
  and the solder didn't see THAT level of heat coming... melty
  solder... loose connection... laptop? No workie no more. Sorry,
  the GPU... the GPU, not the video memory... the GPU got hot. The
  GPU uses video memory for those higher-level operations because
  it can run multidimensional matrix math in parallel at a rate a
  regular CPU plus math coprocessor just can't match. Turning 2D
  into 3D is hard to do when all you have is the equivalent of a
  piece of paper and you're trying to do origami with a toothpick
  and a straw as your only utensils. [toothpick = electron,
  straw = electron hole]
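
  To make that "matrix transforms" bit concrete, here's a minimal
  sketch in plain C of the kind of 4x4-matrix-times-vertex math a
  GPU churns through for every vertex of every frame, millions of
  times over, in parallel, out of video memory. Nothing here is
  GPU-specific or from any particular game; the names and numbers
  are just illustrative.

    /* One vertex through one 4x4 transform: the core operation of
       3D graphics. A CPU can do one of these fine; a GPU is built
       to do millions of them per frame, and that's where the heat
       comes from. Purely illustrative values. */
    #include <stdio.h>

    typedef struct { float v[4]; } vec4;     /* x, y, z, w */
    typedef struct { float m[4][4]; } mat4;  /* row-major 4x4 */

    /* 16 multiplies and 12 adds per vertex, per transform. */
    static vec4 mat4_mul_vec4(const mat4 *m, vec4 p) {
        vec4 out;
        for (int row = 0; row < 4; row++) {
            out.v[row] = m->m[row][0] * p.v[0]
                       + m->m[row][1] * p.v[1]
                       + m->m[row][2] * p.v[2]
                       + m->m[row][3] * p.v[3];
        }
        return out;
    }

    int main(void) {
        /* Translate by (1, 2, 3): about the simplest 3D transform. */
        mat4 translate = {{
            {1, 0, 0, 1},
            {0, 1, 0, 2},
            {0, 0, 1, 3},
            {0, 0, 0, 1},
        }};
        vec4 corner = {{5, 5, 5, 1}};  /* a single vertex */
        vec4 moved = mat4_mul_vec4(&translate, corner);
        printf("(%g, %g, %g)\n", moved.v[0], moved.v[1], moved.v[2]);
        /* prints (6, 7, 8) -- now imagine doing that for every
           vertex of a whole scene, sixty times a second. */
        return 0;
    }
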
  But I don't wanna get into my complaints about the flat layout
  of memory. Drives me nuts. Memory needs to be in a 3D box to
  allow for easy cross-connections, along with a little analog
  uncertainty built into the logic systems... but we'll get there.
  We're still building computers based on models from 50 years ago
  anyway.