
Will the future of technology unravel mankind's Pandora's Box?

Epic Beard Man

Bearded Philosopher
There is no denying technology: it is an essential aspect of human life, constantly improved by human innovation, which leads to societal advancement. As technology shifts, society changes, and with those changes comes a change in how we view our world. Intelligence augmentation and artificial intelligence have the potential to transform the philosophical approach to technology: because of our cognitive limitations, the creation of super-intelligence could push past those limits into a new sociological as well as psychological frontier.

Although technology offers convenience through efficiency, opportunity, communication and medicine, it also leaves us with laziness, a social divide, and the risk of quickly becoming obsolete. With the potential of artificial intelligence, will we in essence open up our Pandora's box, to our doom or our salvation?

Scholar G. Prisco recently stated:

"Getting things to almost work is much, much easier than getting things to work. Engineers know that even if you do 90 percent of the work in 10 percent of the time, then you will have to spend the remaining 90 percent of the time to do the missing 10 percent of the work. Same, of course, for money.
Which means that 90 percent wasn’t really 90 percent, because it left out all the boring details that take 90 percent of the money and the time — boring details like sustainability, operational robustness, error recovery, fail-safe operations and all that, without forgetting social acceptance, financial and political aspects.
That real AI seems always 20 years away indicates that perhaps we just don’t know enough to estimate the development timeline for something that is actually 200 years away, or more. A good analogy is Leonardo’s flying machines. Leonardo correctly guessed that machines could fly, but the actual development of flying machines took centuries and required different technologies.
I don’t buy the idea of a “post-scarcity” utopia (actually, I don’t buy any utopia). It’s worth emphasizing that, from the perspective of our grandfathers and people in poor regions, today’s developed world is a post-scarcity utopia because nobody is starving to death…”

“Without forgetting social acceptance” is the key. We could transform the entire world rather quickly, but the dislocation would be such that, perhaps, all the positive (that we now define as positive) would be negated."
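To put the quoted 90/10 rule in concrete numbers, here is a minimal back-of-the-envelope sketch; the effort figures are purely illustrative assumptions, not Prisco's:

```python
# Back-of-the-envelope illustration of the "90/10" rule quoted above.
# Assumption (purely illustrative): a *working* system costs 100 units of effort,
# and the flashy 90% of visible features consumes only 10 of those units.

total_effort = 100.0          # effort needed for a system that actually works
easy_feature_share = 0.90     # fraction of visible features in the "easy" part
easy_effort_share = 0.10      # fraction of total effort that part really takes

demo_effort = total_effort * easy_effort_share    # 10 units: the "almost works" demo
remaining_effort = total_effort - demo_effort     # 90 units: the boring details

print(f"Demo covering {easy_feature_share:.0%} of features costs {demo_effort:.0f} units")
print(f"Remaining details (robustness, error recovery, ...) cost {remaining_effort:.0f} units")
print(f"So the demo is really {demo_effort / total_effort:.0%} of the job, "
      f"not {easy_feature_share:.0%}")
```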

Thread inspired by: Fruit picking robots to eliminate need for farm workers?
 

TagliatelliMonster

Veteran Member
With the potential of artificial intelligence, will we in essence open up our Pandora's box, to our doom or our salvation?

That would completely depend on how it is applied. The same goes for all technology, not just super-duper AI.

You can use nuclear technology to build weapons of mass destruction, or you can use it to build hi-tech medical equipment to cure nasty diseases.

Likewise, you can use AI technology to implement oppressive systems or autonomous killing drones, or you can use it to solve hard problems like global warming.

The science is separate from the politics.
 

PureX

Veteran Member
As our technical prowess increases, our wisdom does not. This is a recipe for disaster. Like tossing a loaded pistol into a cage full of monkeys.
 
That would completely depend on how it is applied. The same goes for all technology, not just super-duper AI.

You can use nuclear technology to build weapons of mass destruction, or you can use it to build hi-tech medical equipment to cure nasty diseases.

Likewise, you can use AI technology to implement oppressive systems or autonomous killing drones, or you can use it to solve hard problems like global warming.

With AI you have the potential for it to develop in ways we don't understand.

Even very simple things like social media algorithms offering suggestions have had significant unintended consequences, and these negative effects are not easy to remedy.

Far more complex AI systems are going to develop, which will have a major effect on what news we see, who we are connected to, and so on, and which will in turn have other AI systems trying to manipulate them.

It is far easier for humans to create than to understand, and unfortunately one of our defining characteristics is hubris.

At least with things like nuclear weapons, they are largely under human control so their harms come from conscious human actions.

Technologies that are to some extent autonomous, like AI, or self-replicating, like the products of gene editing technologies or advanced forms of AI, are a different class of risk and warrant a different level of precaution.


The science is separate from the politics.

It never is, as it exists in a human context. Tech is only the real-world application of science; we don't live in a textbook.

Science/tech doesn't make us more moral or more rational beings; it simply magnifies the scale at which our stupidity/irrationality or malevolence can cause harm.
 

Brickjectivity

Turned to Stone. Now I stretch daily.
Staff member
Premium Member
There is no denying technology: it is an essential aspect of human life, constantly improved by human innovation, which leads to societal advancement. As technology shifts, society changes, and with those changes comes a change in how we view our world. Intelligence augmentation and artificial intelligence have the potential to transform the philosophical approach to technology: because of our cognitive limitations, the creation of super-intelligence could push past those limits into a new sociological as well as psychological frontier.

Although technology offers convenience through efficiency, opportunity, communication and medicine, it also leaves us with laziness, a social divide, and the risk of quickly becoming obsolete. With the potential of artificial intelligence, will we in essence open up our Pandora's box, to our doom or our salvation?

Scholar G. Prisco recently stated:

"Getting things to almost work is much, much easier than getting things to work. Engineers know that even if you do 90 percent of the work in 10 percent of the time, then you will have to spend the remaining 90 percent of the time to do the missing 10 percent of the work. Same, of course, for money.
Which means that 90 percent wasn’t really 90 percent, because it left out all the boring details that take 90 percent of the money and the time — boring details like sustainability, operational robustness, error recovery, fail-safe operations and all that, without forgetting social acceptance, financial and political aspects.
That real AI seems always 20 years away indicates that perhaps we just don’t know enough to estimate the development timeline for something that is actually 200 years away, or more. A good analogy is Leonardo’s flying machines. Leonardo correctly guessed that machines could fly, but the actual development of flying machines took centuries and required different technologies.
I don’t buy the idea of a “post-scarcity” utopia (actually, I don’t buy any utopia). It’s worth emphasizing that, from the perspective of our grandfathers and people in poor regions, today’s developed world is a post-scarcity utopia because nobody is starving to death…”

“Without forgetting social acceptance” is the key. We could transform the entire world rather quickly, but the dislocation would be such that, perhaps, all the positive (that we now define as positive) would be negated."

Thread inspired by: Fruit picking robots to eliminate need for farm workers?
Imagine young people no longer have to beg for work from the titans of industry. The Industrial Revolution has been very, very hard on families and the poor. Now AI is a new revolution that can overturn all of that and return all work and production back to the commoner. That is why you can't let people talk smack about AI, particularly people who don't understand it. It's not the Terminator. It's the all-in-one appliance and ultimate recycling machine that also powers your house. The potential is not the death of the human race; we are already potentially going to die from lots of things. This is one potential key to preserving the human race.
 

TagliatelliMonster

Veteran Member
With AI you have the potential for it to develop in ways we don't understand.

Only insofar as you allow it to.
It's not like an AI engine geared towards learning about the climate is suddenly going to infiltrate automated factories and turn into a Skynet that starts building terminators...

Even very simple things like social media algorithms offering suggestions have had significant unintended consequences, and these negative effects are not easy to remedy.

"Social" media are a societal cancer, with or without AI.
I agree, though, that AI certainly makes it worse than it already was.

Far more complex AI systems are going to develop, which will have a major effect on what news we see, who we are connected to, and so on, and which will in turn have other AI systems trying to manipulate them.

It is far easier for humans to create than to understand, and unfortunately one of our defining characteristics is hubris.

I'm kind of optimistic about that. Even though it's late in the game at this point, we are starting to see pushback against the practices of these "social" media. Facebook is heavily under fire. Privacy concerns are finding their way into political discourse. Much too late imo, but it's there nonetheless.

I'd be incredibly aggressive against it. I'd have no problem pushing sensible laws that would literally decapitate their business models. But that's the problem: there's loads of money involved, and the lobbying groups have a great deal of influence.

Nonetheless, awareness is starting to be raised and it's only a matter of time imo.

Having said that, even if politics fails to respond in time, it's quite easy not to be part of the problem. Just stay away from those platforms. More and more people are realising it every day.


At least with things like nuclear weapons, they are largely under human control so their harms come from conscious human actions.

Technologies that are to some extent autonomous, like AI, or self-replicating, like the products of gene editing technologies or advanced forms of AI, are a different class of risk and warrant a different level of precaution.

I'd say they rather warrant a serious level of regulation.
The problem, though, is that politics is broken. The democratic system is not efficient enough to keep up with these technological advances. This is why the law always lags behind technological progress.

That's what you get when the most popular gets elected instead of the most qualified.

Our current democratic governmental structures are simply not efficient enough, waaaaay too slow, and inherently lack the expertise to properly deal with such things.

Science/tech doesn't make us more moral or more rational beings; it simply magnifies the scale at which our stupidity/irrationality or malevolence can cause harm.
Or the opposite, of course: magnifying our creativity, intelligence and benevolence.
 

Epic Beard Man

Bearded Philosopher
Imagine young people no longer have to beg for work from the titans of industry. The Industrial Revolution has been very, very hard on families and the poor. Now AI is a new revolution that can overturn all of that and return all work and production back to the commoner. That is why you can't let people talk smack about AI, particularly people who don't understand it. It's not the Terminator. It's the all-in-one appliance and ultimate recycling machine that also powers your house. The potential is not the death of the human race; we are already potentially going to die from lots of things. This is one potential key to preserving the human race.

But what of economics? What of meritocracy and earning wealth? If AI replaces the human, what about earning potential?
 

Epic Beard Man

Bearded Philosopher
If only that were true...

It's like giving several autonomous, adaptive, self-replicating advanced weapons systems to a cage full of monkeys.

But isn't that humanity's doom, especially when it comes to military arsenals, if an artificial intelligence is given control of such things?
 

ChristineM

"Be strong", I whispered to my coffee.
Premium Member
AI is here and benefits millions of people each day. Do a Google search and the results are provided by AI; buy from Amazon and your goods are picked, packaged and tracked by AI. Traffic lights are no longer the dumb things causing a traffic jam; they have AI to do that now. Medical diagnosis is often aided by AI, and some medical procedures are performed by AI (still under a surgeon's supervision); the outcomes of AI-aided procedures tend to be better.

So far we appear to be using AI for the benefit of the human race.
 

TagliatelliMonster

Veteran Member
But what of economics? What of meritocracy and earning wealth? If AI replaces the human, what about earning potential?

It'll be disruptive for sure. Our entire economic model would require a serious overhaul.
Technology most certainly has the capacity to literally transform society from the ground up.

I like the Star Trek example, to take it to an absurd extreme. They have this machine called a "replicator". You ask for something and the machine literally "materializes" it for you. No matter what it is. A cello, a guitar, a cup of tea heated to exactly 79°C, a juicy steak with french fries,...

This makes all medium-sized consumer objects worthless.
Why would you spend money buying a laptop, if you can just have the machine materialize it for you?

It's extreme, of course, but in a sense I feel it captures the gist of it.

If I imagine a world of implemented automated systems, it's easy to see how loads of things we now pay people for will no longer be human jobs.

And while such implementations will create new kinds of jobs, I don't think it's gonna balance out.

If for every 10 jobs that get destroyed, only 1 is created, then we'll have a problem on our hands that will require some serious out-of-the-box solutions. Preferably without risking ending up in some sort of idiocracy.

But I also think this is a reality far into the future, which will gradually unfold. Society will gradually change and adapt to those new realities as well, just like it always has.
 

BSM1

What? Me worry?
As our technical prowess increases, our wisdom does not. This is a recipe for disaster. Like tossing a loaded pistol into a cage full of monkeys.

In that case you must try to be the monkey holding the gun...
 
A better question, imo, is "why on earth would you give an AI control over such things?"

Unfortunately humans are stupid, hubristic and don't learn from experience.

But isn't that humanity's doom, especially when it comes to military arsenals, if an artificial intelligence is given control of such things?

It's not just autonomous weapons systems (although countries are spending billions on developing these), it is everything.

Basically everything will be connected to the internet in future, from clothing with sensors to monitor fitness/health and provide augmented reality experiences, to everything in your house, cars and transportation systems, power generation and information systems.

This will create almost total surveillance over everything we do and generate enormous quantities of information which will then be used to influence us to do things which 'benefit' a company/society/us/etc.

AI will be part of the control system for all of these, and advanced algorithms will be influencing all aspects of our lives. These will be self-modifying and will adapt in ways we cannot comprehend. Powerful machine learning techniques will home in on ways to influence and manipulate us into doing what the algorithm wants us to do (which may not even be what it was designed to influence us to do).

Due to the tendency of tech to become dominated by a small number of successful companies, this will also result in a centralisation of these effects.

When you combine complexity, centralisation and scaling you create very fragile systems, and that's even before considering the threat from malicious actors.

Basic AI like social media recommendation algorithms (which are really just something that tries to stop you from leaving the site) has already diluted mainstream opinions while magnifying extreme groups and conspiracy theories and creating clickbait and trolling cultures, all of which has damaged the mental health of many people, caused a drop in child vaccinations, fuelled the growth of flat-earther-type conspiracies, and possibly affected the result of the US election.
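As a rough illustration of that point, below is a minimal, purely hypothetical sketch of an engagement-maximising feed: an epsilon-greedy bandit that simply learns to show whatever keeps people clicking. The content categories and click rates are invented assumptions, not any real platform's algorithm.

```python
# Purely illustrative sketch of an engagement-maximising feed: an epsilon-greedy
# bandit that learns to show whatever keeps people clicking. The categories and
# click probabilities below are made-up assumptions.
import random

random.seed(0)

# Hypothetical content categories and the (unknown to the algorithm)
# probability that a user clicks on each.
TRUE_CLICK_RATE = {"news": 0.05, "hobbies": 0.08, "outrage_bait": 0.20}

shows = {c: 0 for c in TRUE_CLICK_RATE}
clicks = {c: 0 for c in TRUE_CLICK_RATE}
EPSILON = 0.1  # small fraction of impressions spent exploring at random


def pick_item():
    """Mostly exploit the category with the best observed click rate."""
    if random.random() < EPSILON:
        return random.choice(list(TRUE_CLICK_RATE))
    # Unseen categories get an optimistic estimate of 1.0 so each is tried once.
    return max(TRUE_CLICK_RATE,
               key=lambda c: clicks[c] / shows[c] if shows[c] else 1.0)


for _ in range(100_000):  # simulate feed impressions
    item = pick_item()
    shows[item] += 1
    if random.random() < TRUE_CLICK_RATE[item]:
        clicks[item] += 1

for c in TRUE_CLICK_RATE:
    print(f"{c:13s} shown {shows[c] / sum(shows.values()):5.1%} of the time")

# The optimiser was never told to promote outrage; it simply learned that
# outrage keeps people on the site: the kind of unintended consequence
# described above.
```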

One of the problems with this issue is that we have no idea what will exist in the future so we don't yet have any idea what could go wrong, but it's not just "AI with guns" that we should be worried about, but how AI will influence our thoughts, emotions and actions.
 

Brickjectivity

Turned to Stone. Now I stretch daily.
Staff member
Premium Member
But what of economics? What of meritocracy and earning wealth? If AI replaces the human, what about earning potential?
As always, economics will remain the dismal science, but maybe cottage industries can make a comeback. All these other questions are things we can't predict, but AI augmenting human labor is a good thing. People used to think that a nation's value was in its gold reserves. That is no longer true, so the things people value can change. One economy gets replaced by another.
 

Epic Beard Man

Bearded Philosopher
As always, economics will remain the dismal science, but maybe cottage industries can make a comeback. All these other questions are things we can't predict, but AI augmenting human labor is a good thing. People used to think that a nation's value was in its gold reserves. That is no longer true, so the things people value can change. One economy gets replaced by another.

I know one thing...

The McDonald's kiosk is a lot better when I want specific things on my burger.
 

Epic Beard Man

Bearded Philosopher
Unfortunately humans are stupid, hubristic and don't learn from experience.

It's not just autonomous weapons systems (although countries are spending billions on developing these), it is everything.

Basically everything will be connected to the internet in future, from clothing with sensors to monitor fitness/health and provide augmented reality experiences, to everything in your house, cars and transportation systems, power generation and information systems.

This will create almost total surveillance over everything we do and generate enormous quantities of information which will then be used to influence us to do things which 'benefit' a company/society/us/etc.

AI will be part of the control system for all of these, and advanced algorithms will be influencing all aspects of our lives. These will be self-modifying and will adapt in ways we cannot comprehend. Powerful machine learning techniques will home in on ways to influence and manipulate us into doing what the algorithm wants us to do (which may not even be what it was designed to influence us to do).

Due to the tendency of tech to become dominated by a small number of successful companies, this will also result in a centralisation of these effects.

When you combine complexity, centralisation and scaling you create very fragile systems, and that's even before considering the threat from malicious actors.

Basic AI like social media recommendation algorithms (which are really just something that tries to stop you from leaving the site) has already diluted mainstream opinions while magnifying extreme groups and conspiracy theories and creating clickbait and trolling cultures, all of which has damaged the mental health of many people, caused a drop in child vaccinations, fuelled the growth of flat-earther-type conspiracies, and possibly affected the result of the US election.

One of the problems with this issue is that we have no idea what will exist in the future so we don't yet have any idea what could go wrong, but it's not just "AI with guns" that we should be worried about, but how AI will influence our thoughts, emotions and actions.


[Image: Skynet graphic]
 

Epic Beard Man

Bearded Philosopher
It'll be disruptive for sure. Our entire economic model would require a serious overhaul.
Technology most certainly has the capacity to literally transform society from the ground up.

I like the Star Trek example, to take it to an absurd extreme. They have this machine called a "replicator". You ask for something and the machine literally "materializes" it for you. No matter what it is. A cello, a guitar, a cup of tea heated to exactly 79°C, a juicy steak with french fries,...

This makes all medium-sized consumer objects worthless.
Why would you spend money buying a laptop, if you can just have the machine materialize it for you?

It's extreme, of course, but in a sense I feel it captures the gist of it.

If I imagine a world of implemented automated systems, it's easy to see how loads of things we now pay people for will no longer be human jobs.

And while such implementations will create new kinds of jobs, I don't think it's gonna balance out.

If for every 10 jobs that get destroyed, only 1 is created, then we'll have a problem on our hands that will require some serious out-of-the-box solutions. Preferably without risking ending up in some sort of idiocracy.

But I also think this is a reality far into the future, which will gradually unfold. Society will gradually change and adapt to those new realities as well, just like it always has.

Good point
 