I think of it in two ways:
a) Perhaps machines and other complicated systems (nebulae, the centers
of stars, the Internet) already HAVE consciousness and awareness.
b) If they do not, and there is some sort of tipping point of
complexity required:
i. What is that tipping point?
ii. How will we know if it has consciousness?
iii. How will we know if it has awareness?
iv. How will we know if it has self-awareness?
I'm skeptical of claims of
AI-as-conscious-awareness/awareness-of-self based upon
complexity alone, because if complexity is all it takes, then
it's likely that MANY things are already conscious.
I would also have no problem with the concept of crystals being
intelligent and having consciousness, for example.
But I don't think we can legitimately argue that complexity alone
'somehow' brings forth automatic awareness / consciousness /
self-awareness *if* we start with the assumption that:
a) Only humans have it now (which is quite ridiculous,
considering animals), or
b) Only humans and some animals have it, and nothing below a
certain point does (more reasonable, but some of the tests are
rather ridiculous, like the mirror test).
Personally? I believe all life forms do have consciousness; a
social consciousness from which the individual may or may not be
formed.
Bacteria, for example, have a social awareness and, by
extension, _must_ have some sort of individual awareness, for
they respond to the group, distinguishing "like us" from
"not like us" based upon chemical signaling.
But... you can't have a "me" without first having an "us".