Recent comments in /f/Futurology

MightyH20 t1_jee6cbs wrote

Your example is irrelevant since France already has lower targets. And yet Germany has progressed more than France in percentage reduction.

COP target Germany: cut emissions 65% from the 1990 level. Current emissions down from 1,050 to 675 million tonnes. Reduction = 36%.

COP target France: cut emissions 40% from the 1990 level. Current emissions down from 400 to 300 million tonnes. Reduction = 25%.

Not only is France behind in progress toward its target, its reduction in absolute tonnage is far smaller too.
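Those reduction figures check out; a quick sketch, using only the tonnage numbers quoted in the comment above:

```python
# Percent reduction relative to a 1990 baseline, using the
# million-tonne figures quoted in the comment.
def reduction_pct(baseline_1990, current):
    return 100 * (baseline_1990 - current) / baseline_1990

germany = reduction_pct(1050, 675)  # million tonnes
france = reduction_pct(400, 300)

print(f"Germany: {germany:.0f}% achieved of a 65% target")  # 36%
print(f"France:  {france:.0f}% achieved of a 40% target")   # 25%
```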

2

Evipicc t1_jee5zx2 wrote

If we can eradicate the concept of currency and status/class shortly after the imminent reduction of all labor and work through automation, we'll be fine. Unfortunately, all of the people with currency and status are the ones who control policy.

Frankly I think we'll get to the point where the rich have all the resources and the poor begin to starve, and there will be some... rapid and violent changes. Hopefully the world survives that change.

1

Petal_Chatoyance t1_jee54xw wrote

The only thing that could prevent it is shutting down all computer research. That isn't going to happen.

Besides, technically, non-human intelligence already exists: Koko the gorilla, for example, was able to question her own existence, the meaning of her life, what death means, and various issues of morality.

There is nothing special about human intelligence, and nothing special about meat. What can be done on meat can be done on a machine substrate.

The fantasy is believing - without evidence - that there is anything magically unique, or unreplicable, about human intelligence.

2

Odd_Dimension_4069 OP t1_jee2cf4 wrote

Yeah sorry bro, but your take is pretty garbo. Dude's only here saying that some form of intelligence surviving our extinction is a good thing, and you sound like a lunatic going on about how it isn't, because they get their intelligence from electricity in silicon and metal instead of from electricity in cells and fluids...

You are the one who sounds like a religious fanatic, with the way you sanctify human flesh. Personally, I value intelligence in whatever form it takes. Whether that intelligence has emotions doesn't matter, but TECHNICALLY SPEAKING, we do not KNOW whether something without a biochemical intelligence can experience reality. And we have no idea what non-biological experience looks like.

It is not fanatical to withhold judgement for lack of evidence; it is fanatical to pass judgement because you feel your personal values and beliefs are the be-all and end-all. So stop that shit and get some awareness about you.

1

SlurpinAnalGravy t1_jee2bjs wrote

You really didn't, then. The Director of the BLM is a permanent position that reports directly to Congress. Congress directly oversees the BLM's funding and which projects within its scope are funded. I even listed the reports and FORM NAMES that the BLM has to submit to Congress.

If you're unhappy with people pointing out your lack of reading comprehension, keep your mouth shut. That's all it takes.

−6

deformedexile t1_jee1y3w wrote

My point is that everything you think is special about humans fell out of nothing but descent with modification. Meanwhile, LLMs have actually had facility with language designed into them. It should not be a surprise for LLMs to acquire some abilities with which they were not intentionally endowed, since we have, and the LLMs were intentionally endowed with so much more. And in fact, they already have acquired some abilities with which they were not intentionally endowed.

1

MrEloi t1_jee1g9o wrote

  • Geoffrey Hinton stated that we have had this technology for around 5 years, but it wasn't widely known. This suggests that some firms or governments have been using AI for maybe 2 or 3 years.
  • The AI gurus keep claiming that AGI is several years away .. but .. the rest of their comments hint at it being either here already, or just around the corner.
  • I first started having dark suspicions when I noticed some weird questions being posted on Reddit a year or so ago. They had the 'feel' of being posted by a childish, embryonic AI.
  • The recent petition from a stack of AI gurus and others requesting a halt to AI development is interesting ... clearly these informed experts feel that AGI is very, very close.
  • The way the world's politics and economics have been behaving recently seems almost irrational.

All-in-all, I sometimes feel that 'something odd is happening'.

I very much doubt that AI is controlling us already ... but ... perhaps governments and/or firms are using advice from AIs to manipulate us in strange ways?

3

Odd_Dimension_4069 OP t1_jee1370 wrote

You and your conversational partner have different views, but both make good points. You don't need to agree on the nature of AI, though, to understand something crucial about rights: they didn't come about in human society because "humans have emotions and can feel and cry and suffer and love etc.".

Human rights came about because the humans being oppressed rose up and claimed them. The ones in power didn't give a shit about the lower castes before then.

Rights arise out of a necessity to treat a group as equals. Not because of some intrinsic commonality of "we're all human so let's give each other human rights". They exist because if they didn't, there would be consequences to society.

So you need to understand that for this reason, AI rights could become as necessary as human rights. It may not seem right to you, but neither did treating peasants as equals back in the day. The people of the future will have compassion for these machines, not because there is kinship, but because society will teach them that it is moral to do so.

1

jargo3 t1_jee109k wrote

>No it wouldn't, given the abundance of the elements involved and the impossibility of recycling irradiated materials on a viable timescale. Renewables simply don't suffer from the inherent shortcomings nuclear has here. Extracting Uranium from other sources would make nuclear power even more unviable from an economic standpoint.

I didn't say anything about nuclear waste. Renewable energy needs non-renewable minerals, just like nuclear does.

>The issue is that the concentration in seawater is measured in ppb to begin with and the amount of water you need to filter to extract meaningful quantities of Uranium rises to infinity as the Uranium is extracted.

According to that paper, 7.6 x 10^6 m3/s of seawater would need to be processed to begin with. If you reduce the concentration by 0.01% (30 years / 300,000 years), you would need to process 7.60076 x 10^6 m3/s of seawater after 30 years, not 7 x 10^15 as the study claims. The calculation just doesn't make any sense; the equation doesn't properly take into account the total amount of seawater in the oceans.
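The scaling point above can be sketched numerically, assuming (as the commenter does) that the required flow scales as the inverse of the remaining concentration:

```python
# If uranium extraction depletes the ocean's concentration by a fraction f,
# keeping the same extraction rate requires scaling the processed flow
# by 1 / (1 - f).
initial_flow = 7.6e6   # m^3/s, the paper's starting figure
f = 30 / 300_000       # 0.01% depletion over 30 years

required_flow = initial_flow / (1 - f)
print(f"{required_flow:.6g} m^3/s")  # ~7.60076e6, nine orders of magnitude below 7e15
```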

>Nuclear is already the most expensive option out there. It simply isn't viable as a replacement for fossil fuels on a global scale, and given the growth in energy consumption it is basically impossible to scale it to meet global base load demands.

I didn't say anything about the feasibility of using nuclear to replace all fossil fuels, so please do not argue against this strawman.

1

Odd_Dimension_4069 OP t1_jee0drv wrote

Yeah, look, that's a good suggestion for part of a solution to this problem, which, by the way, I think is precisely the problem I was talking about. Maybe I didn't clarify this enough, but I was talking entirely about the fact that people are stupid, and that because of those stupid people, AI rights will be necessary before AIs ever become sophisticated enough to prove they deserve them.

I like your idea, but I suspect media outlets will continue to use humanizing language to make articles about AI more 'clickable'.

1