anonymous

4 Followers
1422 Karma

Joined 2 years ago
anonymous 2 points *

Did you read what I said?

Creating content is time-consuming, and with such a small user base it's not worth it, so sharing articles and content from other sites is a temporary solution that almost all sites have used until their user base grows.

Again, nobody is preventing you from creating content, but trying to prevent other people from sharing articles isn't the best idea.

I shouldn't say this, but did you know it was actually me who commented on most of your old posts? I created a couple of posts on your antinormie sub too, so no, text content doesn't gain more interactivity. That was just me trying to keep the site alive.

anonymous 2 points *

> most articles posted here aren't even interesting

This is subjective. If you aren't into programming, then you will find articles about programming boring; on the other hand, if you're into bashing normies, then you will be interested in articles making fun of normies.

> but text posts allow greater interactivity.

Guess what? Most text posts on the site are low quality too. Why? Because writing a high-quality text post is time-consuming, and nobody is going to spend an hour or two writing a post for 5 or 10 people to read.

anonymous 2 points

Couldn't agree more. Plus, sites like Tildes that followed this path of elitism couldn't grow their user base even after 5 years; they still have about the same user activity Mainchan had after just 5 months. And guess what, they actually started aggregating content from other sites.

anonymous 2 points

> Sometimes I am forced to use nano.

I hate nano; I always find vim easier to use, with a cleaner TUI.

anonymous 2 points

> GUI apps are better

To each his own. The good thing about CLI apps is that they're way faster to work with than GUI apps, and they use fewer resources. Just try cmus and see how much faster it is compared to GUI music players, especially if you have a big music library.

anonymous 2 points *

> In college you mostly get taught computer science theory so what is the point of going back to college if you recommend against spending time reading about cs theory? I'm not trying to be provocative here I just want to know why you think that way.

Because it significantly improves your chances of getting hired; at least, that's the case where I live.

anonymous 2 points

Pretty good list fren, thanks for sharing

anonymous 2 points

This used to happen a lot back in the early 2000s

anonymous 2 points

Welcome

anonymous 2 points

Welcome back anon

anonymous 2 points

My second favorite album, after The Dark Side of the Moon.

anonymous 2 points

Welcome fren, it's nice to see so many people abandoning Reddit and joining Mainchan.

anonymous 2 points

Nope. Are you on your phone?

anonymous 2 points

That explains why you're not seeing the button. Try opening it from a PC.

anonymous 2 points

Did you know that FYI stands for Fat Young Incel?

anonymous 2 points *

It's ironic how the tables have turned: Reddit moderators, who were once abusing their power, are now the ones complaining about power abuse.

anonymous 2 points

Certainly, handling all edge cases is hard in any implementation. However, for now, implementing it at the IP level is sufficient. As the site grows, he can then afford to hire experts to handle the edge cases.
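
For illustration only, here's a minimal sketch of what such a naive IP-level check might look like (the names, like BLOCKED_NETWORKS, are made up for the example, and the hard edge cases it ignores, shared NATs, VPNs, and rotating addresses, are exactly the ones experts would be hired for):

```python
# Hypothetical sketch of a naive IP-based block check, not Mainchan's
# actual code. Standard library only.
import ipaddress

# Example block list: single addresses and whole ranges.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.7/32"),   # one banned address
    ipaddress.ip_network("198.51.100.0/24"),  # a banned range
]

def is_blocked(client_ip: str) -> bool:
    """Return True if client_ip falls inside any blocked network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

print(is_blocked("198.51.100.42"))  # True: inside the banned /24
print(is_blocked("192.0.2.1"))      # False: not on the list
```

It covers the common case, which is the point: ship the simple version now and leave proxies and shared networks for later.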

anonymous 2 points

On July 13, 1833, during a visit to the Cabinet of Natural History at the Jardin des Plantes, in Paris, Ralph Waldo Emerson had an epiphany. Peering at the museum’s specimens—butterflies, hunks of amber and marble, carved seashells—he felt overwhelmed by the interconnectedness of nature, and humankind’s place within it.

The experience inspired him to write “The Uses of Natural History,” and to articulate a philosophy that put naturalism at the center of intellectual life in a technologically chaotic age—guiding him, along with the collective of writers and radical thinkers known as transcendentalists, to a new spiritual belief system. Through empirical observation of the natural world, Emerson believed, anyone could become “a definer and map-maker of the latitudes and longitudes of our condition”—finding agency, individuality, and wonder in a mechanized age.

America was crackling with invention in those years, and everything seemed to be speeding up as a result. Factories and sugar mills popped up like dandelions, steamships raced to and from American ports, locomotives tore across the land, the telegraph connected people as never before, and the first photograph was taken, forever altering humanity’s view of itself. The national mood was a mix of exuberance, anxiety, and dread.

The flash of vision Emerson experienced in Paris was not a rejection of change but a way of reimagining human potential as the world seemed to spin off its axis. Emerson’s reaction to the technological renaissance of the 19th century is worth revisiting as we contemplate the great technological revolution of our own century: the rise of artificial superintelligence.

Even before its recent leaps, artificial intelligence has for years roiled the informational seas in which we swim. Early disturbances arose from the ranking algorithms that have come to define the modern web—that is, the opaque code that tells Google which results to show you, and that organizes and personalizes your feeds on social platforms like Facebook, Instagram, and TikTok by slurping up data about you as a way to assess what to spit back out.

Now imagine this same internet infrastructure but with programs that communicate with a veneer of authority on any subject, with the ability to generate sophisticated, original text, audio, and video, and the power to mimic individuals in a manner so convincing that people will not know what is real. These self-teaching AI models are being designed to become better at what they do with every single interaction. But they also sometimes hallucinate, and manipulate, and fabricate. And you cannot predict what they’ll do or why they’ll do it. If Google’s search engine is the modern-day Library of Alexandria, the new AI will be a mercurial prophet.

Generative artificial intelligence is advancing with unbelievable speed, and will be applied across nearly every discipline and industry. Tech giants—including Alphabet (which owns Google), Amazon, Meta (which owns Facebook), and Microsoft—are locked in a race to weave AI into existing products, such as maps, email, social platforms, and photo software.

The technocultural norms and habits that have seized us during the triple revolution of the internet, smartphones, and the social web are themselves in need of a thorough correction. Too many people have allowed these technologies to simply wash over them. We would be wise to rectify the errors of the recent past, but also to anticipate—and proactively shape—what the far more radical technology now emerging will mean for our lives, and how it will come to remake our civilization.

Corporations that stand to profit off this new technology are already memorizing the platitudes necessary to wave away the critics. They’ll use sunny jargon like “human augmentation” and “human-centered artificial intelligence.” But these terms are as shallow as they are abstract. What’s coming stands to dwarf every technological creation in living memory: the internet, the personal computer, the atom bomb. It may well be the most consequential technology in all of human history.

anonymous 2 points

Someday soon, a child may not have just one AI “friend,” but more AI friends than human ones. These companions will not only be built to surveil the humans who use them; they will be tied inexorably to commerce—meaning that they will be designed to encourage engagement and profit. Such incentives warp what relationships ought to be.

Writers of fiction—Fyodor Dostoyevsky, Rod Serling, José Saramago—have for generations warned of doppelgängers that might sap our humanity by stealing a person’s likeness. Our new world is a wormhole to that uncanny valley.

Whereas the first algorithmic revolution involved using people’s personal data to reorder the world for them, the next will involve our personal data being used not just to splinter our shared sense of reality, but to invent synthetic replicas. The profit-minded music-studio exec will thrill to the notion of an AI-generated voice with AI-generated songs, not attached to a human with intellectual-property rights. Artists, writers, and musicians should anticipate widespread impostor efforts and fight against them. So should all of us. One computer scientist recently told me she’s planning to create a secret code word that only she and her elderly parents know, so that if they ever hear her voice on the other end of the phone pleading for help or money, they’ll know whether it’s been generated by an AI trained on her publicly available lectures to sound exactly like her and scam them.

Today’s elementary-school children are already learning not to trust that anything they see or hear through a screen is real. But they deserve a modern technological and informational environment built on Enlightenment values: reason, human autonomy, and the respectful exchange of ideas. Not everything should be recorded or shared; there is individual freedom in embracing ephemerality. More human interactions should take place only between the people involved; privacy is key to preserving our humanity.

Finally, a more existential consideration requires our attention, and that is the degree to which the pursuit of knowledge orients us inward or outward. The artificial intelligence of the near future will supercharge our empirical abilities, but it may also dampen our curiosity. We are at risk of becoming so enamored of the synthetic worlds that we create—all data sets, duplicates, and feedback loops—that we cease to peer into the unknown with any degree of true wonder or originality.

We should trust human ingenuity and creative intuition, and resist overreliance on tools that dull the wisdom of our own aesthetics and intellect. Emerson once wrote that Isaac Newton “used the same wit to weigh the moon that he used to buckle his shoes.” Newton, I’ll point out, also used that wit to invent a reflecting telescope, the beginnings of a powerful technology that has allowed humankind to squint at the origins of the universe. But the spirit of Emerson’s idea remains crucial: Observing the world, taking it in using our senses, is an essential exercise on the path to knowledge. We can and should layer on technological tools that will aid us in this endeavor, but never at the expense of seeing, feeling, and ultimately knowing for ourselves.

A future in which overconfident machines seem to hold the answers to all of life’s cosmic questions is not only dangerously misguided, but takes away that which makes us human. In an age of anger, and snap reactions, and seemingly all-knowing AI, we should put more emphasis on contemplation as a way of being. We should embrace an unfinished state of thinking, the constant work of challenging our preconceived notions, seeking out those with whom we disagree, and sometimes still not knowing. We are mortal beings, driven to know more than we ever will or ever can.

The passage of time has the capacity to erase human knowledge: Whole languages disappear; explorers lose their feel for crossing the oceans by gazing at the stars. Technology continually reshapes our intellectual capacities. What remains is the fact that we are on this planet to seek knowledge, truth, and beauty—and that we only get so much time to do it.

anonymous 2 points

As a small child in Concord, Massachusetts, I could see Emerson’s home from my bedroom window. Recently, I went back for a visit. Emerson’s house has always captured my imagination. He lived there for 47 years until his death, in 1882. Today, it is maintained by his descendants and a small staff dedicated to his legacy. The house is some 200 years old, and shows its age in creaks and stains. But it also possesses a quality that is extraordinarily rare for a structure of such historic importance: 141 years after his death, Emerson’s house still feels like his. His books are on the shelves. One of his hats hangs on a hook by the door. The original William Morris wallpaper is bright green in the carriage entryway. A rendering of Francesco Salviati’s The Three Fates, holding the thread of destiny, stands watch over the mantel in his study. This is the room in which Emerson wrote Nature. The table where he sat to write it is still there, next to the fireplace.

Standing in Emerson’s study, I thought about how no technology is as good as going to the place, whatever the destination. No book, no photograph, no television broadcast, no tweet, no meme, no augmented reality, no hologram, no AI-generated blueprint or fever dream can replace what we as humans experience. This is why you make the trip, you cross the ocean, you watch the sunset, you hear the crickets, you notice the phase of the moon. It is why you touch the arm of the person beside you as you laugh. And it is why you stand in awe at the Jardin des Plantes, floored by the universe as it reveals its hidden code to you.

anonymous 2 points

Fake and gay

anonymous 2 points

You're welcome fren!

There is an extension to unlock content behind paywalls: Firefox, Chrome
