By: Daniela Tajonal Flores

This picture was taken in a park in Roseville. I was on a walk and noticed how beautiful the green trees and plants looked. Hearing the water flow was relaxing.



By: Calvin Westin

In the story of Prometheus, fire stolen from the gods by a mythical figure became a symbol of the destructive side of progress. While it gave humankind the tool to build civilization and rule the world, the fire itself remained uncontrollable. Mary Shelley used this same idea in creating the story of Victor Frankenstein, whose creation proved too powerful to stay under its creator's control, much like fire in the ancient myth. And now, as artificial intelligence technology develops at a rapid pace, humans seem to be following this pattern once again. By releasing these powerful artificial intelligence systems, companies like OpenAI are committing a modern theft of fire, risking the disruption of domains once reserved for human intelligence.
People are also worried that our current safety rules aren't enough to keep up. In Chapter 5 of Shelley's book, Victor Frankenstein's despair over what he has made creates an unsettling picture. Having brought the monster into being, Frankenstein reflects, "This was then the reward of my curiosity; and I became myself capable of bestowing existence on whom I had created. Perhaps a corpse would be reanimated; galvanism had given tokens of such a wonder."
This story is a major warning for us today. Scientists like Geoffrey Hinton believe that computers will soon surpass humans at analyzing and handling data. Once developers lose control over what these machines produce, serious consequences will follow.
Consequently, it is time for IT professionals to start taking their responsibilities seriously. New technology is not an excuse for cutting corners in this field, so experts working on AI need to follow strict safety rules while conducting their research.
In the end, the people who create something bear full moral responsibility for it.