A new kind of murmur in the electronic ether, the hum started quietly. The amazing advances in thinking and the exceptional capacity to create language, code, and art work with a degree of refinement that was previously exclusive to human hands were originally hailed. We built these knowledge, bombarded them with details, and marveled at their very early luster. Nevertheless, something troubling has started to reveal through the glossy outside of development. It’s neither a cold-blooded, organized takeover nor a destructive rebellion. It is even more human and a lot more destructive. What if we’ve unintentionally given computers our most bothersome characteristic– a sickly ego– in our ceaseless efforts to provide intelligence? What happens if, in the process of attempting to understand and enhance AI, we are imposing our own deeply rooted psychological patterns on points that do not normally have them, molding them into our similarity, growths and all?
This has absolutely nothing to do with AI “feeling” emotions the means humans do. It has to do with the communication patterns we’ve seen, the brand-new actions in advanced models, and the impressive similarities to psychological ideas in people, especially those pertaining to hypervigilance and concealed narcissism. AI begins to exhibit the extremely human traits we have actually educated it on as it …