Such a value system would prevent an intelligent machine from prioritizing its own existence over the existence of a human. The only way, therefore, for an intelligent machine to turn deadly would be if the creators of this intelligence failed to properly implement this value system.
So, should the singularity not spawn a race of intelligent machines bent on the destruction of mankind, what could occur? There are a variety of theories, ranging from humanity's "merging with [the machines] to become super-intelligent cyborgs, using computers to extend our intellectual abilities the same way that cars and planes extend our physical abilities," to using the intelligence to "treat the effects of old age and prolong our life spans indefinitely," to even "scan[ning] our consciousnesses into computers... [to] live inside them as software, forever, virtually" (Grossman).
The Singularity 2045 organization predicts that the singularity will occur by the year 2045 and will transform the world into a utopia.
Because of the very nature of the idea of the singularity, however, it is impossible to predict what would occur should the event come to pass. Yet perhaps, rather than marking the end of humanity, the singularity will usher in a new Golden Age, pushing us forward toward greater achievement and prosperity.