Daily Archives: July 31, 2022

AI art tool Midjourney has all the answers to ‘what if’ – The Indian Express

Posted: July 31, 2022 at 9:13 pm

Inspired by NASA's recently released images of the universe, the first prompt I fed into the Artificial Intelligence (AI) tool of research lab Midjourney was "a spaceship surrounded by galaxies". The result was an image of a vessel suspended in space that seems to reflect the cosmos around it, pretty much true to the prompt.

For Midjourney's founder David Holz, a powerful aspect of generative AI is "its ability to unify with language, where we can use language as a tool to create things". In simple terms, generative AI takes a text prompt from the user and creates novel images based on the dataset it has learned from different sources over time.
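
Midjourney's own model and pipeline are proprietary, but the basic text-to-image loop described here can be sketched with an open diffusion model. The snippet below is a minimal illustration using the Hugging Face diffusers library; the model name is only an example, and this is not Midjourney's actual code.

```python
# Minimal text-to-image sketch with an open diffusion model (illustration only;
# Midjourney's own model and pipeline are proprietary and not shown here).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example open model, not Midjourney
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a spaceship surrounded by galaxies"   # the article's example prompt
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("spaceship.png")
```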

The rise of text-to-image generation has also raised philosophical questions over the definition of an artist.

British mathematician Marcus du Sautoy argues in his 2019 book The Creativity Code: Art and Innovation in the Age of AI that "Art is ultimately an expression of human free will and until computers have their own version of this, art created by a computer will always be traceable back to a human desire to create." He states that if we were to create a mind in a machine, it would perhaps offer a glimpse into its thoughts. "But we are still a long way from creating conscious code," du Sautoy concludes.

Similarly, Holz notes, "It's important that we don't think of this as an AI artist. We think of it more like using AI to augment our imagination. It's not necessarily about art but about imagining. We are asking, 'what if'. The AI sort of increases the power of our imagination."

Midjourney allows its users to feed in their prompts on its Discord server and then generates four images matching the text. The user can choose to explore more variations of any of them and upscale the best fit to a higher-quality image. The bot entered open beta last month, giving users a certain number of free trials to bring their imaginations to life. The images generated can also be minted into NFTs, for which, until recently, Midjourney charged royalties.

"It's a giant community of almost a million people who are all making images together, dreaming and riffing off each other. All of the prompts are public and everybody can see each other's images; that's pretty unique," Holz tells indianexpress.com.

Holz co-founded Leap Motion, a hand-tracking motion-capture user-interface company, in 2010, and was featured in the Forbes 30 Under 30 list of 2014. He now runs Midjourney, a small self-funded research and design lab that is exploring a range of projects, including the AI visualisation tool, with 10 other colleagues.

Elaborating on the response received by the AI bot, Holz says, "A lot of people are very happy and find using the product a deeply emotional experience. People use it for everything from a project to art therapy. There are people who have always had things in their mind but were unable to express them before. Some people have conditions like aphantasia, where the mind can't visualise things, and they are now using the bot to visualise for the first time in their life. There's a lot of beautiful stuff happening."

The bot also takes care to prevent the misuse of the platform to generate offensive images. The community guidelines urge users to refrain from using prompts that are inherently disrespectful, aggressive, or otherwise abusive, as well as from generating adult content or gore. Midjourney also makes use of moderators who watch out for people violating the policies and give them a warning or ban them. It also has automated content moderation where certain words are banned on the server. The AI, too, learns from user data, Holz explains: if people don't like something, it generates less of that.
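
As a rough illustration of the automated side of that moderation (a banned-word check before a prompt ever reaches the model), here is a minimal sketch; the word list and function are hypothetical, not Midjourney's actual filter.

```python
# Hypothetical banned-word prompt filter, sketching the kind of automated
# moderation described above; Midjourney's real rules are far more extensive.
BANNED_TERMS = {"gore", "blood"}   # example terms only

def is_prompt_allowed(prompt: str) -> bool:
    words = prompt.lower().split()
    return not any(term in words for term in BANNED_TERMS)

print(is_prompt_allowed("a spaceship surrounded by galaxies"))  # True
```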

I chanced upon the Midjourney bot during a cursory glance through my Twitter feed, where I saw renditions of a somewhat post-apocalyptic Delhi posted by user psychedelhics.

Having previously dabbled with AI bots like Disco Diffusion and Craiyon, I found an interesting aspect of discovering Midjourney was seeing how different AIs respond to the same text. I fed the same prompt, "city during monsoon rains", to Midjourney, to Disco Diffusion, a free-to-use AI tool hosted on Google Colab, and to Craiyon, formerly known as DALL-E mini.

While Craiyon throws up relatively realistic images, Disco Diffusion shows surreal, impressionistic results, and Midjourney sits somewhat in the middle of the two.

According to Holz, Midjourney can be understood as a playful, imaginative sandbox. "The goal is to give everybody access to that sandbox, so that everyone can understand what's possible and where we are as a civilisation. What can we do? What does this mean for the future?"

Holz dismisses fears that AI is here to replace humans or their jobs. "When computer graphics was invented, there were similar questions: will this replace artists? And it hasn't. If anything, computer graphics makes artists more powerful," he says.

Holz adds, "Whenever we see something new, there's a temptation to try and figure out if it's dangerous, and we treat it like a tiger. AI isn't a tiger. It's actually more like a big river of water. A tiger is dangerous in a very different way than water. Water is something that you can build a boat for, you can learn to swim, or you can create dams that make electricity. It's not trying to eat us, it's not angry at us. It doesn't have any emotion or feelings or thoughts. It's just like a powerful force. It is an opportunity."

Link:

AI art tool Midjourney has all the answers to 'what if' - The Indian Express


Naval Surface Force plan to accelerate AI adoption expected ‘in the next few weeks’ – FedScoop

Posted: at 9:13 pm

Written by Brandi Vincent Jul 29, 2022 | FEDSCOOP

Naval Surface Force, U.S. Pacific Fleet has reached the final phase of drafting its first-ever overarching data and artificial intelligence strategy and implementation plan, and it aims to share those resources more broadly before this summer ends, a key task force leader told FedScoop on Friday.

Naval Surface Force organizations perform a variety of administrative, maintenance, workforce and operational training functions and help equip and staff Navy warships before they are deployed to the respective fleet commands for military missions. As one of the Navy's largest enterprises, the organizations capture, produce and rely on massive amounts of data.

About a year ago, the Navy created Task Force Hopper to produce a complex digital infrastructure and cultural transformation to ultimately drive AI-enabled capabilities across the surface force.

"We understand that AI is the most powerful decision engine for the readiness and sustainment of our ships and for warfighting, and we wanted to see, as an enterprise, how we can best accelerate this effort. This is about AI adaptation," Task Force Hopper Director Capt. Pete Kim told FedScoop in an interview.

Kim has served in that role since the task force's inception, and also leads the Surface Analytics Group.

To him and his team, the task force represents a broad enterprise approach about "hiring the right digital talent, making sure our teams have the right development platforms for excellent data exploitation and creating new processes for the force."

As such a sprawling enterprise, the surface force has many projects and initiatives within the Navy and in collaboration with academia and industry, resulting in siloed but duplicative efforts and gaps in transparency, among other issues.

"We all know that AI is totally dependent on high-quality and accurate data. We believe that data management is the most important discipline for our era, and that's what we wanted to focus on. So, one of our first initiatives was to draft a data and AI strategy and implementation plan for the force to establish that structure," Kim explained.

That in-the-making document will be unclassified, but "the audience is really the surface enterprise," he added, suggesting that only a summary may be publicly disseminated.

"Right now, it is in the final stages of drafting. We hope to push that out to the enterprise here in the next few weeks," the director said.

Broadly, the strategy and accompanying plan will detail why the Naval Surface Force is taking the approach that it is, and include directions for how officials will support the chosen framework.

"It really focuses on making data AI-ready across the board, and so it gets into data management, data governance, digital talent, ensuring that we've got clearly defined use cases," Kim noted.

The strategy anchors on what he deemed a "federated model" with a number of different supporting organizations.

"It's about central data governance, with a central data catalog, and then having these decentralized analytic and AI development nodes at different places in the enterprise, where people know the data the best," Kim said.
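
The federated pattern Kim describes, a single central catalog fed by decentralized nodes that own their own data, can be pictured roughly as follows. This is a hypothetical Python sketch for illustration only; the class and field names are invented and not drawn from the Navy's actual plan.

```python
# Hypothetical sketch of a federated data model: one central catalog,
# many decentralized nodes that register and own their datasets.
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    name: str
    owner_node: str       # the node closest to the data, e.g. "maintenance"
    description: str

@dataclass
class CentralCatalog:
    entries: dict = field(default_factory=dict)

    def register(self, entry: DatasetEntry) -> None:
        self.entries[entry.name] = entry   # central governance, single index

    def find(self, keyword: str) -> list:
        return [e for e in self.entries.values() if keyword in e.description]

catalog = CentralCatalog()
catalog.register(DatasetEntry("ship_sensor_logs", "maintenance", "readiness and sustainment data"))
catalog.register(DatasetEntry("crew_rosters", "staffing", "workforce and staffing data"))
print([e.name for e in catalog.find("readiness")])   # ['ship_sensor_logs']
```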

While nodes at one military location, or associated with one team, might focus on maintenance, others could concentrate on staffing or other needs. Kim added that the nodes will home in on different categories of AI, depending on the use case and "what products and models we're trying to build."

Offering two examples of nodes aligned with readiness, he pointed to the Surface Analytics Group that he runs, which assists with the force generation of about 168 warships, and a separate group that zeroes in on operational safety and risk indicators.

"Readiness is our focus here at the [Naval Surface Force]. Program offices and warfare centers that are, let's just say, focused on lethality or warfighting, we expect those organizations to have nodes working on their specific areas and use cases," Kim said. "Then the follow-on here is, again, that common development environment where we as an enterprise have transparency on all the different projects that are going on and then we can leverage each other's work."

Once the strategy and implementation plan are released, next steps will prioritize empowering each of the AI nodes.

"These nodes are going to support all the priority projects. So, it's going to be about, 'Hey, do you have the right tools in this development environment? Do you have the right digital talent to move out on this? Are there different datasets that you need that the rest of the enterprise can help you out with?' So, we'll really focus in on the select nodes to support those priority projects," Kim said.

Originally posted here:

Naval Surface Force plan to accelerate AI adoption expected 'in the next few weeks' - FedScoop


Exclusive: NHS to use AI to identify people at higher risk of hepatitis C – The Guardian

Posted: at 9:13 pm

The NHS is to use artificial intelligence to detect, screen and treat people at risk of hepatitis C under plans to eradicate the disease by 2030.

Hepatitis C often does not have any noticeable symptoms until the liver has been severely damaged, which means thousands of people are living with the infection, known as the "silent killer", without realising it.

Left untreated, it can cause life-threatening damage to the liver over years. But with modern treatments now available, it is possible to cure the infection.

Now health chiefs are launching a hi-tech screening programme in England in a fresh drive to identify thousands of people unaware they have the virus.

The scheme, due to begin in the next few weeks, aims to help people living with hepatitis C get a life-saving diagnosis and access to treatment before it is too late.

The NHS will identify people who may have the virus by using AI to scan health records for a number of key risk factors, such as historical blood transfusions or an HIV diagnosis.

Anyone identified through the new screening process will be invited for a review by their GP and, if appropriate, further screening for hepatitis C. Those who test positive for the virus will be offered treatment made available after NHS England struck a deal with three major pharmaceutical companies.
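
The screening step described above amounts to flagging records that contain known risk factors. The sketch below is a hypothetical illustration of that idea in Python; it is not the NHS's actual software, and the factor list and record format are simplified examples taken only from the factors named in this article.

```python
# Hypothetical risk-factor screen over simplified patient records; the factor
# list and record format are illustrative, not the NHS's actual criteria.
RISK_FACTORS = {"historical_blood_transfusion", "hiv_diagnosis"}  # examples from the article

def flag_for_review(record: dict) -> bool:
    """Return True if the record contains any known hepatitis C risk factor."""
    return bool(RISK_FACTORS & set(record.get("history", [])))

patients = [
    {"id": 1, "history": ["historical_blood_transfusion"]},
    {"id": 2, "history": ["asthma"]},
]
to_invite = [p["id"] for p in patients if flag_for_review(p)]
print(to_invite)   # [1] -> invited for GP review and further screening
```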

Prof Graham Foster, national clinical chair for NHS England's hepatitis C elimination programmes, said the scheme marks a significant step forward in the fight to eliminate the virus before 2030. "It will use new software to identify and test patients most at risk from the virus, potentially saving thousands of lives," he added.

"Hepatitis C can be a fatal disease which affects tens of thousands across the country, but with unlimited access to NHS treatments and innovative patient-finding initiatives such as this one, we will continue to boost the life chances of thousands of patients by catching the virus even earlier."

Hepatitis C deaths have fallen by over a third in five years. Recent data shows the number of deaths from the virus has decreased by 35%, from 482 in 2015 to 314 in 2020 in England. New cases have also fallen from 129,000 in 2015 to 81,000 in 2020, according to the UK Health Security Agency.

Hepatitis C is usually spread through blood-to-blood contact. It can be spread by sharing unsterilised needles, particularly needles used to inject recreational drugs.

The actor Pamela Anderson contracted hepatitis C while married to the musician Tommy Lee, who had a history of drug abuse, when the couple shared a tattoo needle. Anderson, 55, was cured after taking antivirals.

Experts say the recent fall in deaths is largely down to earlier detection of the virus coupled with improved access to treatments.

Rachel Halford, the chief executive of the Hepatitis C Trust, said: "Thanks to the brilliant advances we have seen in hepatitis C treatment in recent years, we have a real opportunity to eliminate the virus as a public health concern in the next few years. However, in order to do so we need to make progress in finding those living with an undiagnosed infection and refer them into treatment.


"That is why the announcement of this new screening programme is such welcome news. Primary care is where we are most likely to find those who have been living with an undiagnosed infection for many years.

"There has been brilliant work to expand testing in a wide range of settings in recent years, but we have not yet seen the advances we need to see in primary care. The rollout of this screening programme is therefore another crucial step towards achieving elimination."

NHS staff are also visiting at-risk communities in specially equipped trucks to test for the virus and carry out liver health checks with portable scans to detect liver damage.

The health service is aiming to wipe out the virus in England before the global goal of 2030 set by the World Health Organization.

Excerpt from:

Exclusive: NHS to use AI to identify people at higher risk of hepatitis C - The Guardian


Tesla, Drive.AI top key players in the Intelligent Driving market – Teslarati

Posted: at 9:13 pm

Tesla and Drive.ai are top key players in the Intelligent Driving market, according to a new Global Intelligent Driving Market report. The report gives an overview of the growth of Intelligent Driving in 2021 and how that growth changed significantly from the previous year as the global economy recovers from the impacts of Covid-19.

The report looks at business models and marketing strategies used by key market players such as Tesla and Drive.ai and how they are able to stay competitive while accelerating their business growth in the market.

It gives several market scenarios and recommendations to help these companies stay ahead of their competitors. It also provides insights into how the Covid-19 pandemic impacted the market.

Additionally, it highlights the trends that either influenced or challenged the market during the pandemic.

The report identifies existing opportunities and strategies that a company can use to increase its competitive edge.

Tesla, Drive.ai, and Mobileye are just three of the top market players identified by the report. The Intelligent Driving market is divided into two types: autonomous vehicles and autonomous systems.

For autonomous vehicles, the two application types are passenger vehicles and commercial vehicles.

The full list is as follows:

Many people forget that Tesla isn't just an automaker; it's also a technology company. On September 30, 2022, Tesla will hold its second AI Day, and there's a chance we could see the Optimus Bot prototype.

During the Q2 2022 earnings call, Elon Musk spoke briefly about AI Day.

"We're hosting our AI Day in a few months.

"I think people will be amazed at what we're able to show off on AI Day. So basically, there's a tremendous amount to look forward to in the second half of this year."

While attending Tesla's first AI Day in person last year, I watched Ganesh Venkataramanan, Tesla's senior director of Autopilot hardware and the leader of the Dojo project, present.

Venkataramanan pointed out the insatiable demand for speed and capacity for neural network training.

Then he shared Tesla's goal: to achieve the best artificial intelligence training performance while supporting larger and more complex models, and while being both power-efficient and cost-effective.

"We thought about how to build this and we came up with a distributed compute architecture. After all, all the training computers are distributed computers in one form or the other."

Go here to read the rest:

Tesla, Drive.AI top key players in the Intelligent Driving market - Teslarati


Drover AI is using computer vision to keep scooter riders off sidewalks – TechCrunch

Posted: at 9:13 pm

Shared micromobility companies have been adopting startlingly advanced new tech to correct for the thing that cities hate most: sidewalk riding. Some companies, like Bird, Neuron and Superpedestrian, have relied on hyper-accurate GPS systems to determine if a rider is riding inappropriately. Others, like Lime, have started integrating camera-based computer vision systems that rely on AI and machine learning to accurately detect where a rider is.

The latter camp has largely leaned on the innovations of Drover AI, a Los Angeles-based startup that has tested and sold its attachable IoT module to the likes of Spin, Voi, Helbiz, Beam and Fenix to help operators improve scooter safety and, most importantly, win city permits.

Drover, which was founded in May 2020, closed out a $5.4 million Series A Wednesday. The startup will use the funds to continue building on the next generation of PathPilot, Drover's IoT module that contains a camera and a compute system that analyzes visual data and issues commands directly to the scooter. Depending on the city's needs, the scooters will either make noises to alert a rider that they're driving on the sidewalk or slow them down. The new version, called PathPilot Lite, will do much of the same, except it will be "more integrated, better and cheaper," says Drover's co-founder and chief business officer Alex Nesic.
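
The module's basic loop, as described, is to classify the riding surface from camera frames and then alert or throttle the scooter when it detects a sidewalk. The sketch below is a hypothetical illustration of that control flow; the classifier and scooter interface are invented stand-ins, not Drover's actual PathPilot code.

```python
# Hypothetical sidewalk-detection loop sketching the control flow described
# above; the classifier and Scooter interface are invented stand-ins.
from enum import Enum

class Surface(Enum):
    STREET = "street"
    BIKE_LANE = "bike_lane"
    SIDEWALK = "sidewalk"

class Scooter:
    def limit_speed(self, km_h: int) -> None:
        print(f"limiting speed to {km_h} km/h")

    def play_alert_sound(self) -> None:
        print("playing sidewalk alert")

def classify_surface(frame) -> Surface:
    # Stand-in for the on-board vision model that scores each camera frame.
    return Surface.SIDEWALK

def handle_frame(frame, scooter: Scooter, city_policy: str) -> None:
    if classify_surface(frame) is Surface.SIDEWALK:
        if city_policy == "slow":
            scooter.limit_speed(km_h=5)     # throttle down on the sidewalk
        else:
            scooter.play_alert_sound()      # or just warn the rider

handle_frame(frame=None, scooter=Scooter(), city_policy="slow")
```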

Drover has modules on over 5,000 vehicles with orders for over 15,000 more that the company needs to deliver by the end of the year, according to Nesic.

Getting those next-gen modules into production will require hiring a few more minds across the engineering and project management side of things, as well as in government relations and communications over in Europe to aid Drover's expansion across the pond.

Nesic said Drover also intends to hire a software engineer to help build out the data dashboards the company offers to micromobility partners.

Drover's operator-facing beta dashboard breaks trips down by infrastructure used. Image Credits: Drover AI

"We have a beta dashboard that shows a color-coded version of what trips look like broken down by infrastructure, how much time each vehicle spends on each section, and in aggregate across an entire fleet," Nesic told TechCrunch. "We have a parking validation dashboard where, for any one of your vehicles deployed in the city, you can see where the ride ended and what our AI scored the parking job as, along with a photo. So all these tools we're offering our operator customers are things they could build themselves based on the data we're already sharing with them, but they just don't have the bandwidth, so these customer-facing tools are a value add."

Drover also sells its data to cities and is exploring the use of distributed cameras moving through cities to build out a suite of tools that could potentially provide a city-facing dashboard that displays information like the state of infrastructure or bike lane violations, which is a pet project of Nesic's.

"Our system can tell you, for example, the rider was on the bike lane for 20% of the time, 30% of the time on the sidewalk and the rest in the street," said Nesic. "That can inform a lot of policy decisions on where to put bike lanes or whether the bike lanes you've invested in are working."

Drover has been receiving interest from transportation agencies like Transport for London, as well as insurance companies that want this kind of granular data to understand how new mobility modes are being used in the infrastructure.

There are some who say the future of micromobility is really in owned vehicles, rather than shared. If that's the case, it stands to reason that shared micromobility sets the trends that future private scooters will have to live by. Advanced rider assistance systems are becoming table stakes for operators who want to win cities, and Nesic thinks policy for private vehicles might soon follow; in fact, he's hoping it does.

"Part of the money we raised is gonna go towards exploring other integrations further up the supply chain with vehicle manufacturers and IoT manufacturers," said Nesic. "The real goal is to drive the cost down, and, if we can, have our tech already be a feature set of the next-gen IoT that has compute capability, and then we're just licensing Drover's AI, which is equipped to handle different infrastructure across the globe."

But that's way down the line. For the short term, Drover is still hyper-focused on expanding off the backs of shared scooter companies that are increasingly hearing demand for this kind of tech from cities.

Drover's Series A was led by Vektor Partners, a VC firm focusing on the future of mobility. The firm recently raised a 125 million fund for sustainable mobility, which is where Drover's recent raise came from.

Follow this link:

Drover AI is using computer vision to keep scooter riders off sidewalks - TechCrunch


Mark Zuckerberg ignores objections, says Instagram will show twice as much A.I.-recommended content by end of 2023 – Fortune

Posted: at 9:13 pm

Despite loud and sustained protests from Instagram users, Meta chief Mark Zuckerberg says the changes that have rolled out to the social media platform are not only going to remain, they're going to intensify.

In an earnings call Wednesday, Zuckerberg said the photo-sharing app will double the amount of A.I.-recommended content it shows by the end of next year.

"One of the main transformations in our business right now is that social feeds are going from being driven primarily by the people and accounts you follow to increasingly also being driven by A.I. recommending content that you'll find interesting from across Facebook or Instagram, even if you don't follow those creators," he said.

At present, about 15% of the content in a person's Facebook or Instagram feed is recommended by the company's A.I. That's reflected in posts from people, groups, or accounts users don't follow. By the end of next year, Zuckerberg said, those numbers will double, to roughly 30%.

The basis for the increase, he says, is greater engagement, which in turn increases monetization.

"Social content from people you know is going to remain an important part of the experience and some of our most differentiated content," he said. "But increasingly, we'll also be able to supplement that with other interesting content from across our networks."

The push for this AI content, which is currently best reflected in the number of Reels that appear on Facebook and Instagram pages, comes as TikTok threatens Meta's dominance in the social media space. That was reflected in Meta's earnings Wednesday, as the company reported the first revenue decline in its history as a public company.

The push is working, though. Meta says it has seen a 30% increase in the amount of time people are spending with Reels.

The cost of that success, though, is customer and ambassador goodwill. Facebook suffered a backlash earlier this week from Instagram power user Kylie Jenner, who shared a post that read "stop trying to be TikTok. I just want to see cute photos of my friends." Her sister Kim Kardashian, who has 326 million Instagram followers, soon followed suit.

That caught the attention of Instagram CEO Adam Mosseri, who posted a video on Twitter to address the criticisms, pledging, "We're going to continue to support photos. It's part of our heritage."

"That said, I need to be honest," he added. "I do believe that more and more of Instagram is going to become video over time. We see this even if we change nothing."


Read more here:

Mark Zuckerberg ignores objections, says Instagram will show twice as much A.I.-recommended content by end of 2023 - Fortune


AI Innovation Consortium Accelerates the Evolution of the Industrial Metaverse – Digital Journal

Posted: at 9:13 pm

HOUSTON, TX, July 31, 2022 /24-7PressRelease/ The AI Innovation Consortium is collaborating with the University of Houston (UH), NVIDIA, and TechnipFMC to advance the creation of industrial metaverse applications as part of the digital oilfield laboratory initiative. This collaboration brings together end-users, technology providers, and domain experts to build commercially relevant use cases and accelerator tools and break through informational silos. These resources ultimately enable an interconnected world with an industrial metaverse.

Adam Berg, Manager of Learning Solutions at TechnipFMC, has been working with the UH College of Technology and the AI Innovation Consortium to test and pilot a successful augmented reality program to train and manage upstream resources. TechnipFMC, a leading engineering and services company, is a driving force for the augmented, immersive, and virtual reality program known as X-Reality or the XR program.

"The purpose of the XR Network is to bring together the knowledge, tools, and people necessary to leverage XR technologies, to achieve business goals, and to transform industry environments. Our work with the college and the consortium has enabled us to move beyond concept to pilot and implementation," added Berg.

Berg and his team have received some of the digital oilfield lab's foundational tools in the form of digital twins of assets and natural language processing and AI-enhanced computer vision pipelines. These tools have helped bridge the gap between the real world and the metaverse, making it possible to create augmented reality environments for training, repair, and quality control without many of the manual complexities.

This industrial metaverse initiative is part of the University of Houston College of Technology's digital oilfield laboratory, which brings together faculty and students with AI Innovation Consortium partners such as TechnipFMC and NVIDIA to explore the development of commercially relevant digital solutions. These activities involve some of the largest manufacturers and oil and gas companies in the world. The goal of the consortium is to use technologies such as artificial intelligence, augmented reality, and digital twins to create and build foundational tools for training solutions with unlimited scalability for enterprise.

Building on top of NVIDIA accelerated computing platforms including NVIDIA DGX systems, the NVIDIA Jetson edge AI platform, and NVIDIA OVX, consortium members have access to software platforms like NVIDIA Omniverse to build and accelerate the evolution of the metaverse.

With the help of NVIDIA's Global Energy Director, Marc Spieler, the initiative is bringing these tools to end-users like Berg and others.

"Our work with the AI Innovation Consortium and its ecosystem of partners and members is enabling the critical collaboration required between academic research, technology, and industry to scale artificial intelligence to achieve measurable outcomes in industries like energy and manufacturing. Using NVIDIA platforms like Omniverse and Modulus for digital twins and physically accurate simulations will accelerate and streamline returns on digital investments by reducing downtime and safety incidents," said Spieler.

With the support of industry groups like the AI Innovation Consortium, domain experts from across the table are working together toward common goals without barriers to information sharing.

Konrad Konarski, Chairperson of the AI Innovation Consortium, added:

"We are currently working to build the world's largest portfolio of digitized metaverse assets and environments for both the oilfield and specific manufacturing sectors. This means a maintenance manager, an operations technology expert, or whoever is responsible for a metaverse technology project will be able to pick up an augmented reality platform or a wearable computer, or simply a smartphone, and seamlessly interconnect their real-world operating environment to and from the metaverse."

The implications of this effort are that deploying augmented reality, additive manufacturing, and digital twin technology platforms, and scaling these platforms across any organization, will be exponentially simpler.

The collective collaboration is also deeply rooted with the support of academic domain experts and professors at the University of Houston College of Technology, including consortium trustee Professor of Practice David Crawley. Professor Crawley, along with several members from the University of Houston College of Technology, has been working with the NVIDIA Isaac robotics platform for enhanced development, simulation, and deployment, as well as with other industrial robotic systems and cloud-based command and control platforms, as part of the AI digital oilfield that directly collaborates as part of the industrial metaverse initiative.

"The consortium's academic ecosystem is a critical part of developing the future workforce. This workforce will be capable of using collaborative robot, or cobot, technologies as part of the evolution of Industry 5.0, and experienced in developing and deploying artificial intelligence, augmented reality, and digital twin technologies. The joint initiative at our AI digital oilfield lab with the AI Innovation Consortium brings together professors, undergraduates, and postgraduate students on the development of the industrial metaverse, lights-out manufacturing, and broader AI and augmented reality application systems, and leverages their expertise and aligns academia with industry," said Professor Crawley.

This academic and collaboration footprint also extends to other universities and technology institutes. Among them, PhD student William Aiken from the University of Ottawa in Canada and teams from the MxD smart manufacturing center in Chicago are playing a part in building dark-factory systems with the help of the industrial metaverse.

The mission of the AIIC is to bring together industry leaders across different domains to support a holistic perspective on artificial intelligence: to build standards and best practices that help drive technology adoption, to evolve guiding principles for privacy, data governance, and data bias, and to effectively align AI evolution with the most relevant industry challenges and objectives now and in the future.

Related Link:https://aiinnovationconsortium.org


Read the original post:

AI Innovation Consortium Accelerates the Evolution of the Industrial Metaverse - Digital Journal


7 Intriguing K-Pop MVs That Adopted Futuristic Concepts – soompi

Posted: at 9:12 pm

K-pop has been around for several decades, and it continues to evolve as time passes. Out of the visual themes that have seen the light of day, the adoption of futuristic concepts has always been intriguing, especially given the anticipation of how the world progresses and how close these cinematographies are to predicting the distant future.

Without further ado, here are seven K-pop music videos that adopted futuristic concepts.

VIXX makes use of artificial intelligence in this epic love story. The plot revolves around Hongbin, who attempts to bring his late partner back to life as a cyborg and successfully achieves it. Unfortunately, the couple's happy ending is cut short when their reunion is jeopardized by a group of men in black. It seems it was an error to proceed with such a delicate operation in a world that adamantly fights it!

AleXa's earlier cinematography mainly focused on artificial intelligence. In this track, it seems like different versions of herself are fighting one another, and each version has a certain distinctive robotic feature, some more disturbing than others. In a fight for survival, the artist is staying true to her "Do Or Die" motto.

Kai's solo debut is nothing short of mind-blowing, and the scenery he chose for his music video fits the advanced-technology theme perfectly. While the idol is serenading hearts and rocking moves, his background shows a futuristic city with flying vehicles, drones, and lots of plugs practically everywhere. Add to that Kai's glitching as he appears and disappears mid-performance, which looks like high-quality holograms!

2NE1 takes things up a notch, showing us not only a futuristic universe but also a post-apocalyptic one. A virtual program is recruiting citizens with promises of a heaven-like life. They put them to sleep while plugging them into a simulated but far from accurate paradise, further disconnecting them from the real world. In this music video, the quartet teams up to put an end to this masquerade and bring the population back home.

ONF is incessantly watched, chased, and caught while living in an insanely wired world. The members continuously rebel against the system and its high tech, which seems to be doing more harm than good, all while living their best lives. Optimistic and courageous, they eventually turn things to their advantage and put an end to the fight. To be able to drag a whole army of robots to dance along is truly something beautiful to witness!

SM's most recent girl group has adopted futurism as their overall concept, with their AI counterparts making up the second half of the group. That being said, it's only natural that they feature bits of it in all of their music videos. With "Girls" being their latest comeback, the members are shown being part of a game, metaverse style, only they are playing for their survival in the real world.

B.I is yet another artist who embraces the theme in his videography, except this time no one is attacking the ultra-modern city. Instead, he takes us on a jolly ride around in the company of friends and a potential love interest. For some reason, one can assume (and hope) that the vibe of the music video mirrors B.I's current life, since he looks so relaxed and happy.

Honorable mentions:

SuperM's "Jopping," NCT 127's "Superhuman," and MIRAE's "KILLA."

Which K-pop music video has adopted the coolest futuristic concept? Let us know in the comments below!

Esmee L. is a Moroccan lively dreamer, writer, and Hallyu enthusiast.


See the original post here:
7 Intriguing K-Pop MVs That Adopted Futuristic Concepts - soompi


Nike launches futuristic AR experience in China – Creative Review

Posted: at 9:12 pm

Nike has unveiled Trove, a stylish digital experience geared towards its customers in China, which celebrates the brand's collaborations with Matthew M Williams, who helms the Maison Givenchy fashion house, and music artist and producer G-Dragon.

Optimised for use on smartphones, the experience uses AR tech to allow users to pan around the unique worlds built for each collaborator according to their chosen aesthetic language. Though the design concepts are visibly distinct from one another, they both share a futuristic feel.

Williams' world is akin to a glitchy yoga retreat in the desert, while the atmospheric area created for G-Dragon feels as though an architect has gone to town on a cavernous underground bunker.

The experience links to additional content where people can read more about the Nike collaborations with the two creatives, and they can also generate a personalised business card inspired by each theme. The local version of the experience contains quick links to products embedded in the AR environments.

The platform was brought to life by BBH China, production studio Unit9 and Nike's global catalyst brand management team, Nike's global marketing arm, which works on the brand's collaborative projects across fashion, design, entertainment, music and art.

All of the platform copy is written in simplified Chinese, so people requiring another language to navigate the experience might benefit from their mobile browser's live translation function, if one is available.

Access Nike's Trove experience via smartphone here

Continue reading here:
Nike launches futuristic AR experience in China - Creative Review


The First Rolls-Royce EV Has a Suspension That Can See The Future – Jalopnik

Posted: at 9:12 pm

The Rolls-Royce Spectre is the firm's first EV. Photo: Rolls-Royce

Like every other car maker, Rolls-Royce is going electric. By 2030, the firm says it'll be a fully electric car maker, and it is now preparing to launch its first EV. Coming next year, the Spectre is now undergoing an impressive array of tests to put the electric car through its paces.

After running the Spectre EV close to the Arctic Circle to test it in extreme temperatures, Rolls-Royce has now headed to a setting more appropriate for the cars life on the road: the French riviera.

The move is all part of its aim to cover more than 1.5 million miles to test the car in a variety of settings. In France, the Spectre EV will run on the Autodrome de Miramas facility, which previously held the French Grand Prix way back in 1926. The car will also cover the roads around the Côte d'Azur.

The Rolls-Royce Spectre: perfectly noiseless. Photo: Rolls-Royce

While soaking up the sun, and hopefully a few miles, Rolls-Royce will put the Spectre's complex suspension through its paces.

In the south of France, the Rolls-Royce Spectre will cover almost 390,000 miles to ensure it delivers the firm's signature "magic carpet" smooth ride. In order to do this while carrying around the added weight of an EV, Rolls-Royce has developed new hardware and software to control the Spectre's suspension.

The new system sees the car read the road ahead and, on straights, automatically decouple the car's anti-roll bars, allowing each wheel to act independently. This, the firm says, will stop the Spectre from rocking whenever it hits an undulation in the road.

Once a corner is spotted in the road ahead, the components are re-coupled, the suspension dampers stiffen and the four-wheel steering system prepares to activate.

The Rolls-Royce Spectre putting its suspension through its paces. Photo: Rolls-Royce

According to Rolls-Royce, more than 18 sensors are monitored when cornering, and the car's built-in computer makes minute adjustments to the steering, braking, power delivery and suspension.
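
Put together, the behaviour described above is a simple look-ahead control loop: decouple the anti-roll bars and soften on straights, re-couple and stiffen before a corner. The sketch below is a hypothetical illustration of that logic only; the sensor and chassis interfaces are invented and this is not Rolls-Royce's actual software.

```python
# Hypothetical sketch of the look-ahead suspension logic described above;
# the RoadPreview and Chassis interfaces are invented for illustration.
class Chassis:
    def couple_anti_roll_bars(self): print("anti-roll bars coupled")
    def decouple_anti_roll_bars(self): print("anti-roll bars decoupled")
    def stiffen_dampers(self): print("dampers stiffened")
    def soften_dampers(self): print("dampers softened")
    def arm_four_wheel_steering(self): print("four-wheel steering armed")

class RoadPreview:
    def __init__(self, corner_ahead: bool):
        self.corner_ahead = corner_ahead

    def corner_detected(self) -> bool:
        return self.corner_ahead

def update_suspension(road_ahead: RoadPreview, chassis: Chassis) -> None:
    if road_ahead.corner_detected():
        chassis.couple_anti_roll_bars()      # re-couple before the bend
        chassis.stiffen_dampers()
        chassis.arm_four_wheel_steering()
    else:
        chassis.decouple_anti_roll_bars()    # each wheel absorbs bumps on its own
        chassis.soften_dampers()             # "magic carpet" ride on straights

update_suspension(RoadPreview(corner_ahead=False), Chassis())
```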

As well as that high-tech suspension system, the Rolls-Royce Spectre also boasts the title of the most aerodynamic Rolls-Royce of all time.

The pursuit of ultimate aerodynamics is personified by the redesigned Spirit of Ecstasy mascot on the front of the Spectre. But the reduction of drag on the car goes much deeper than its emblem.

The new and improved Spirit of Ecstasy. Photo: Rolls Royce

The company's spaceframe architecture and extensive wind tunnel testing and digital modeling have helped Rolls-Royce engineers cut the Spectre's drag coefficient to just 0.25. That doesn't quite match the 0.20 that Mercedes claims for its EQS sedan, but it does put the car on a par with the Honda Insight.
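
For context, the drag coefficient feeds into the standard aerodynamic drag formula, so at the same speed, air density and frontal area (an assumption for the sake of comparison, since the two cars' frontal areas differ), the Spectre's 0.25 implies roughly 25% more drag force than the EQS's 0.20:

```latex
F_d = \tfrac{1}{2}\,\rho\, v^{2}\, C_d\, A,
\qquad
\frac{F_{d,\text{Spectre}}}{F_{d,\text{EQS}}}
  = \frac{C_{d,\text{Spectre}}}{C_{d,\text{EQS}}}
  = \frac{0.25}{0.20} = 1.25
\quad (\text{same } \rho,\ v,\ A).
```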

Before all that aerodynamic goodness can head out to customers, Rolls-Royce says it still has a further 600,000 miles of testing to cover with the car. After that, the first customer deliveries of the Spectre will begin in the fourth quarter of 2023.

Read the rest here:
The First Rolls-Royce EV Has a Suspension That Can See The Future - Jalopnik
