Machine Learning Algorithm Used in Identifying Exoplanets

A research team trained the algorithm by having it go through data collected by NASA’s now-retired Kepler Space Telescope, which spent nine years in deep space on a world-hunting mission. Once the algorithm learned to accurately separate real planets from false positives, it was used to analyze old data sets that had not yet been confirmed — which is where it found the 50 exoplanets.

These 50 exoplanets, which orbit other stars, range in size from as large as Neptune to smaller than Earth, the university said in a news release. Their orbits range from as long as 200 days to as short as a single day. And now that astronomers know the planets are real, they can prioritize them for further observation.

The algorithm could “validate thousands of unseen candidates in seconds,” the study indicated. And because it’s based on machine learning, it can still be improved upon, and can continue to become more effective with each new discovery.
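The core task described above, separating real planets from false positives and then scoring unconfirmed candidates, can be illustrated with a toy classifier. This is a minimal sketch on invented data, not the research team's actual pipeline: the features (transit depth, signal-to-noise ratio), their values, and the nearest-centroid method are all assumptions for illustration.

```python
import math
import random

random.seed(0)

# Synthetic training data: each candidate is (transit_depth, signal_to_noise).
# Labels: 1 = confirmed planet, 0 = false positive. Values are invented.
def make_candidate(label):
    if label:  # planets here: shallow transits, high signal-to-noise
        return (random.gauss(0.01, 0.003), random.gauss(20, 4)), 1
    else:      # false positives here: deep eclipses, low signal-to-noise
        return (random.gauss(0.05, 0.01), random.gauss(8, 3)), 0

train = [make_candidate(i % 2) for i in range(200)]

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

planet_c = centroid([f for f, y in train if y == 1])
fp_c     = centroid([f for f, y in train if y == 0])

def classify(features):
    """Nearest-centroid vote: closer to the planet centroid -> 'planet'.
    A real pipeline would scale features and output a probability."""
    d = lambda c: math.dist(features, c)
    return "planet" if d(planet_c) < d(fp_c) else "false positive"
```

Once trained, such a model can score a backlog of unconfirmed candidates in bulk, which is what makes the "thousands of candidates in seconds" claim plausible.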

 

Rendering Superintelligent A.I. as Fallible: A Case for Symbiosis Between Imperfect Humans and the Singularity-Era Bot

It’s the year 2055, and what was once a toy rabbit awkwardly hopping on its own mechanical whim is now jumping and running faster than a human. With fluid motion, it can leap over a large truck and continue its smooth stride. It’s not human enough to try out for the Olympics; however, it has been recognized as the fastest techno-rabbit on the block and receives an award from the Guinness Book of A.I. Achievements for furthest mecha-leaper. Chants and cheers erupt from the audience as other animal mechanoids show support for their ‘tricks aren’t for kids’ futuristic friend.

With advances in AGI (Artificial General Intelligence) accelerating and a technological singularity a potential future, how can we reconcile the power of superintelligence with imperfect humanity? It may be that we need to create a sense of forced fallibility within superintelligent systems by means of machine learning clauses, vitals awareness, and ethical statutes.

For humans, the obvious vital source of energy is food; for most robots, it’s electricity. We display varying levels of energy and mood depending on the types and amounts of food we eat. Similarly, for future superintelligent machines, electrical power adjustments and machine learning algorithmic clauses could determine a robot’s mood by increasing or decreasing throughput and processing speed in relation to its current power level. The hibernation of a system depleted of electricity, imposed by energy constraints, could be analogous to a human nap.
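The food-to-electricity analogy can be sketched as a toy model: an agent whose "mood" and throughput are throttled by its remaining charge, with depletion forcing a nap-like hibernation. The class name, thresholds, and numbers are all illustrative assumptions, not a real robotics API.

```python
class EnergyAwareAgent:
    """Toy model: throughput scales with remaining charge, the way human
    energy and mood track food intake. Thresholds are invented."""

    def __init__(self, capacity_wh=100.0):
        self.capacity_wh = capacity_wh
        self.charge_wh = capacity_wh

    @property
    def mood(self):
        level = self.charge_wh / self.capacity_wh
        if level > 0.6:
            return "energetic"
        if level > 0.2:
            return "conservative"
        return "hibernating"  # analogous to a human nap

    def tasks_per_second(self):
        # Processing speed is throttled in proportion to remaining charge.
        if self.mood == "hibernating":
            return 0.0
        return round(100.0 * self.charge_wh / self.capacity_wh, 1)

    def drain(self, wh):
        self.charge_wh = max(0.0, self.charge_wh - wh)
```

A fully charged agent runs at full speed; as `drain` is called its mood shifts and its throughput falls, until it hibernates entirely.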

Quantum battery research hints at the possibility of near-perpetual energy, but perhaps that is the ultimate downside: a frightening prospect when the energy source powers a superintelligent bot. By utilizing neural networks fed with vitals data, most importantly variable electricity levels, a superintelligent robot could be aware of its own fluctuating power supply and increase or decrease its activities, and perhaps even its own throughput, according to a fixed energy capacity.

A.I. ethical commandments from an oversight committee could help humanity if the singularity ever occurs. By building brute-force fallibility measures into A.I. systems, so that they learn and train in accordance with our own sense of imperfection, we might keep A.I. capability at somewhat of a standstill, preventing it from increasing exponentially beyond the calculations and throughput of the human brain.

 

Selfie Type Utilizes Invisible Keyboard Tech

Selfie Type is a project from Samsung’s in-house idea incubator C-Labs that delivers a virtual keyboard to users with the help of an AI algorithm and a front-facing camera. No additional hardware is required to set it up.

It allows users to type without physically touching their device’s on-screen keyboard. How accurate it is with tracking keystrokes remains to be seen, but the demo video certainly looks promising.

An email being typed in the demo suggests that Selfie Type will be released to Samsung users as a software update, available to all current Samsung flagships with a front-facing camera. The remaining details are vague, but Samsung has officially confirmed that the tech analyzes finger movements captured by the front camera and converts them into QWERTY keyboard inputs.
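One plausible final stage of such a pipeline, hypothetical here since Samsung has not published its design, is snapping an estimated fingertip "press" position to the nearest key of a virtual QWERTY layout. The coordinate system and row stagger below are assumptions for illustration only.

```python
import math

# Three rows of a virtual QWERTY layout, top to bottom.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_positions():
    """Assign each key an (x, y) center, staggering lower rows slightly
    the way physical keyboards do. Units are arbitrary key-widths."""
    pos = {}
    for r, row in enumerate(ROWS):
        for c, key in enumerate(row):
            pos[key] = (c + 0.5 * r, float(r))
    return pos

KEYS = key_positions()

def nearest_key(x, y):
    """Return the key whose center is closest to the estimated press point.
    A real system would add language-model correction on top of this."""
    return min(KEYS, key=lambda k: math.dist((x, y), KEYS[k]))
```

The hard part, of course, is the upstream vision model that turns camera frames into those (x, y) press estimates; this snippet only shows the geometric decoding step.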

 

Could Our Dreams Allow Us to Gaze into Alternate Aberrations of Ourselves in a Conscious Universe?

Philosophers have wondered whether one can ever be certain, at any given point in time, that one is not in fact dreaming, never experiencing the reality of wakefulness at all. Our senses allow us to perceive our environment, but is our environment part of objective reality?

Quantum fluctuations and multiverse theory suggest the possibility of multiple iterations of “you” performing different objectives at the same time. While you drove to work this morning, another version of you went to the park and took a walk. The objective reality that we experience is only a fraction of an infinite number of possible outcomes.

Consider a raindrop and the waves that propagate after it hits the surface. Imagine the raindrop is your observable conscious system, and each wave it creates is an iteration of your reality. Perhaps during REM sleep, the stage that produces some of our most intense dreaming, the dream state isn’t a fabrication of the mind but an observation of an alternate iteration of you in another place in space and time. Perhaps this dream-state awareness, a pseudo-simulation of ourselves, acts like a neural network in training, one that allows us to function more efficiently in our waking state.

Could datasets from dream blogging or dream recollection support a predictive model of a more efficient self? The correlation between dreams and waking-life focus could introduce a new era of self-improvement and give credence and pathways to self-conscious AI.
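The kind of dream-versus-focus correlation speculated about here could at least be measured. Below is a toy sketch: the dataset is entirely invented (pairs of words recalled in a dream log and a self-reported next-day focus score), and the Pearson correlation is computed by hand with only the standard library.

```python
# Hypothetical dream-journal dataset: (words recalled in the dream log,
# self-reported focus score the next day on a 0-10 scale). These numbers
# are invented purely to illustrate the analysis, not real data.
entries = [(20, 4), (80, 6), (150, 7), (200, 8), (40, 5), (120, 7)]

recall = [e[0] for e in entries]
focus  = [e[1] for e in entries]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(recall, focus)  # close to 1.0 for this made-up data
```

A strong correlation in real journals would only be a starting point, since self-reported data like this is noisy and the causality could run either way.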

 

Are Websites Getting Boring? What’s New with Web 3.0?

It’s a sultry 90 degrees outside, and you’ve discovered your coffee is still warm after sitting for two hours. To pass the time, you check email on your laptop and browse some of your favorite websites. After your most recent gander through the glorified web advertisements, you notice something familiar about the last several websites you visited. The verdict is conclusive, and you start to ponder: “Wow, I love the large cover image or video with scrolling hover elements, but could there be something more creative or different to capture my users’ attention?” Web 2.0 has come and passed, so what’s next when it comes to app development and the responsive web?

Leading development strategists and innovators, including Elon Musk of SpaceX, suggest that a lack of innovation eventually leads to stagnation. Creativity and ingenuity are leading characteristics of humanity that closely follow innovation, so we should expect increasing evolutionary phases of transformation as the pendulum of time swings.

So what’s the next phase of web development? Web 3.0, or the ‘intelligent web’, could include the semantic web, microformats, natural language search, the Metaverse, data mining, machine learning, recommendation agents, and artificial intelligence technologies.
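The “semantic web” item on that list has a concrete, already-deployed form: publishing machine-readable structured data alongside a page so that search engines, agents, and recommenders can understand it. A minimal sketch, generating a schema.org JSON-LD snippet with Python (the article values are placeholders):

```python
import json

# A schema.org description of a blog post. The "@context"/"@type" keys
# follow the JSON-LD convention used for structured data on the web.
article = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "What's New with Web 3.0?",
    "keywords": ["semantic web", "machine learning", "natural language search"],
}

# Embed the structured data the way pages typically do: in a script tag.
snippet = ('<script type="application/ld+json">'
           + json.dumps(article)
           + "</script>")
```

Dropping a snippet like this into a page’s `<head>` is one small, practical step from Web 2.0 presentation toward the machine-interpretable web the post describes.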

It’s been said that object-oriented programming creates living, breathing entities that have knowledge inside them and can remember things. While this archaic idea from Steve Jobs still has some validity today, how does it relate to emerging trends such as machine learning, quantum computing, neural networks, and artificial intelligence? Will Web 3.0 standardize deep A.I. and machine learning to bring about a post-modern, self-conscious web where communication is transmissible via our thoughts or dreams?

In her recent book, Artificial You: AI and the Future of the Mind, Susan Schneider offers a philosophical exploration of what A.I. can and cannot achieve. She also discusses the pros and cons of mind-design enhancements and the potential of future machine-mind hybrids.

How do you see Web 3.0 changing and evolving? Contact us and let us know.

 
