Being that this is a computer, it's in a department I'm
  comfortable and familiar with. My experience with human rights
  is mostly limited to a youth rights activist thing I did back in
  1990-1997 on the early Internet - it spawned a few more lobbying
  type groups, but my interest was more in freedom of
  expression/creativity/education than in voting rights,
  parenting rights, etc.

  Anyway - being that it's a computer... there are a lot of people
  involved. This tree illustrates the everyday issues with making
  anything.

  A flaw in any part of the process and you end up with a
  half-assed product. This is the norm rather than the exception;
  problems with robot bomb sniffers were notoriously awful in the
  beginning stages. Nobody was listening to anybody.

  Anyway, assume it makes it to a compromise where everybody is
  happy enough with the product - in this case, a flying computer
  that follows the instructions given to it ahead of time by the
  users, the engineers, the programmers, and so on.

  So, is there a human in charge? Yes. There are *many*, many humans
  in charge, and they all play a part in making something like
  this work according to plan.

  Or at least they try to. The drone they end up with will be
  all too human, with all of its makers' flaws programmed right
  into it.
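
  Just to make that concrete, here's a toy Python sketch - entirely
  made up, not anything a real drone actually runs - of what
  "follows the instructions given to it ahead of time" looks like.
  Every name and threshold in it is invented for illustration, but
  the point stands: each line is a human decision, baked in before
  the thing ever leaves the ground.

    # Hypothetical pre-programmed rule table (illustrative only):
    # every threshold below stands in for a decision made long
    # before takeoff by users, engineers, and programmers.
    def decide(sensor_reading):
        """Pick an action from pre-written rules; nothing is learned in flight."""
        if sensor_reading.get("battery_pct", 100) < 20:
            return "return_to_base"      # the engineers' cutoff, right or wrong
        if sensor_reading.get("target_confidence", 0.0) > 0.9:
            return "alert_operator"      # the programmers' cutoff, right or wrong
        return "continue_patrol"         # the default the users signed off on

    if __name__ == "__main__":
        print(decide({"battery_pct": 15}))          # return_to_base
        print(decide({"target_confidence": 0.95}))  # alert_operator
        print(decide({"battery_pct": 80}))          # continue_patrol

  Swap the thresholds and you've swapped one group of humans'
  judgment for another's; the drone itself never decides anything
  it wasn't handed in advance.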

  How will they use it? That's the territory you're talking about.