What Norwegians are learning as they pioneer autonomous ships

Norway is the perfect place to develop autonomous ships. Norwegians love boats, they love technology, and they love to cooperate. On top of that, autonomous ships have practical applications that could affect the lives of many in Norway.

Mary Ann Lundteigen manages SFI AutoShip, an eight-year programme on autonomous ships and operations funded by the Research Council of Norway and 22 partners. She is a professor at the department of engineering cybernetics at the Norwegian University of Science and Technology (NTNU), where the centre is hosted.

“Even though some people think fully autonomous ships already exist, as far as I know, the first commercial ships with autonomy will all start at degree 2,” says Lundteigen. “Degree 2 means the ship is remotely controlled but has at least one seafarer on board.

“Reaching degree 3 – remote control and with no crew on board – is a bigger challenge, so a test period at degree 2 is a good way to gain experience. And then there’s degree 4, which is fully autonomous – where the ship can operate fully on its own and with no seafarers on board. Degree 4 is out of the question for now – at least commercially. But it is an area of active research here in Norway.”

Much of the research is focused on developing small autonomous vessels that operate in restricted areas. This simplifies the work – and, it turns out, it may solve a problem many Norwegians face every day.

“A part of our daily lives in Norway is crossing fjords to get to work,” says Frode Halverson, cluster manager for Ocean Autonomy Cluster. “Bridges and tunnels are expensive. Ferries are a better option in many situations.”

Operating several small ferries would be less expensive and more environmentally friendly than operating one big ferry. With smaller ferries, though, the cost of the crew is proportionally higher than for big ferries, so reducing the crew size has a bigger payback. Autonomous ship technology is one way of making ferries smaller and smarter. 

A lab to design shore control centres

NTNU is currently working with partners to test a prototype of an autonomous passenger ferry, together with a control room from which operators can intervene remotely as needed. Unlike self-driving cars, autonomous boats can be run by a remote operator in a cost-effective manner. But when the person controlling the vessel is not on board, a new set of challenges arises – including the fact that the captain may not be the one who goes down with the ship.

“We are making shore-based control rooms to monitor and potentially take over control of autonomous ships,” says Ole Andreas Alsos, head of NTNU Shore Control Lab, which is part of SFI AutoShip. “Our lab does not serve as a control room for a specific purpose. Instead, it’s a shore control lab where we do research on different control room designs. We want to learn how to build the best control rooms for different applications: urban ferries, maritime autonomous surface ships, big deep-sea shipping, short sea shipping, car ferries, and so on.” 

The physical design consists of an array of screens and monitors and a very powerful computer using one of the best graphics cards built for gaming applications – the RTX 3090. Currently, operators inside the control room receive only visual and audio information from the ferry, but NTNU is looking at ways of replicating the feel of the ship inside the lab. In the future, this might include haptic feedback, so operators can feel the wind and waves – and if a docking is hard, they will feel it.

Studying operators’ reactions

“Our control room has extra features that help us study how operators behave,” says Alsos. “We have cameras, of course, to see what is happening in the control room. But operators also wear wrist bands so we can measure their heart rate variability and their skin conductance, which indicates their stress levels.

“We use glasses with eye-tracking so we can see where they are looking. That is a good indication of where their attention is. We also measure their pupil dilation. From the size of their pupils, we can get a good impression of their cognitive load – a large pupil means high cognitive load; a small pupil means lower cognitive load,” he explains.
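The physiological signals Alsos describes can be combined into a rough workload estimate. The sketch below is purely illustrative – RMSSD is a standard heart-rate-variability metric, but the thresholds and the way the signals are combined here are invented for illustration, not NTNU's actual method:

```python
# Illustrative sketch: combining pupil-diameter and heart-rate-variability
# readings into a rough cognitive-load indicator. Thresholds are hypothetical.

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (a standard HRV metric)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def load_indicator(pupil_mm, baseline_pupil_mm, rr_intervals_ms):
    """Classify operator state: larger pupils and lower HRV suggest higher load."""
    dilation = pupil_mm - baseline_pupil_mm
    hrv = rmssd(rr_intervals_ms)
    if dilation > 0.5 and hrv < 20:      # dilated pupils, suppressed HRV
        return "high load"
    if dilation < 0.1 and hrv > 50:      # near-baseline pupils, relaxed HRV
        return "possible underload"
    return "normal"
```

In practice such signals are noisy and individual baselines vary widely, which is exactly why the lab calibrates against each operator and combines several sensors.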

Video feeds and sensor data from the control room and operator are sent to another lab, where researchers can speak through a microphone to give operators instructions on what to do. The researchers can also observe control room operators’ behaviours, communication styles and stress levels. 


There is also a large meeting room where video feeds from the experiments or usability tests are displayed. This allows developers, product managers, project leaders and other stakeholders to follow along. This feedback system helps stakeholders assess different control room designs.

“To explore different situations, we built a ferry simulator, which is like a digital twin,” says Alsos. “It behaves exactly like a real ferry, but it has at least one advantage – we can create situations that rarely happen in real life or are too dangerous to test. We can simulate kayakers coming towards the vessel, or people falling into the water. We can simulate fires on board and see how operators react and how much time it takes them to take over control of the ferry.”

One area of concern is cognitive underload. During long, monotonous trips, the captain on board a ship tends to get bored and may no longer be able to react quickly when needed. The same effect will be even more significant for control room operators, who won’t even be on the vessel. To address this problem, NTNU is exploring ways to keep operators thinking just the right amount. If they do too little, they get bored; if they do too much, they get stressed.

“We might have one person operating one ship,” says Alsos. “If the situation is very complex, we could have several people operating one ship. At the other end of this scale, we could have one person operate several ships, or a team of people operating a whole fleet of ships. The scenario we are currently testing is having two people operate up to 20 small passenger ferries. We are trying to find the best user interface for this specific case.

“There are some international standards on traditional control room design,” he adds. “But autonomous shipping is very new, so we are the ones paving the way. To do this, we collaborate closely with the coastal authorities, who are very proactive and flexible. They are prepared to change some of the regulations to make this happen. They know that ships will gradually become more and more autonomous – not tomorrow or next year, but at some point in the future.”

Artificial intelligence versus rules-based decision engines

“One obvious difference between self-driving cars and autonomous ships is that cars travel much faster,” says Lundteigen. “In a dangerous situation, a car has to intervene very quickly. A ship is larger and slower – and there are, of course, fewer of them. A ship needs much more predictive capability to understand what might happen further into the future to be able to prepare for the situation well in advance. Cars can be brought very quickly to a halt, but large ships take a long time to stop.”

A challenge for autonomous ships is getting them to communicate their state and intentions – not only to the operator, but also to other vessels, to avoid deadlocks and accidents. To make matters worse, autonomous ships sometimes manoeuvre in unexpected ways because they don’t behave like humans do. This makes it difficult for human pilots and seafarers to trust an autonomous ship.

“This is an example of ‘explainable AI’ or ‘automation transparency’,” says Alsos. “Complex AI systems, robots and autonomous ships need to communicate their state and future intention to us so that we can trust them and make good decisions based on what they communicate. That AI black box needs to be transparent to us.”

Even though trials have been run with ships using artificial intelligence (AI), it’s still not certain that’s how they will operate when they are commercialised.


Ørnulf Jan Rødseth, senior scientist at Sintef Ocean, believes it will be rules-based or directly programmed. “It’s very difficult to test deep learning because you never know what percentage of the cases you’ve covered with your testing. You train the system on the data you have, from the situations you know. But in real life, variations of those situations usually arise,” he says.

“What I expect is that you will use more rule-based decision-making for anti-collision, and I think a very important element is that the system understands when a situation is not quite predictable. In that case, the system should ask for remote control,” adds Rødseth.

“The main problem in developing an anti-collision system is in trying to guess what a manned ship is going to do,” he says. “Some people say that autonomous systems behave in ways that are difficult for humans to interpret, but the reverse is also true. Autonomous systems have trouble predicting people, especially in complex situations. If all ships were autonomous and they all cooperated, it would be quite straightforward.”

One of the things ships will have to cooperate on is collecting and correlating data, according to Svein David Medhaug, senior surveyor at the Norwegian Maritime Authority.

“The system that is supposed to steer and navigate a vessel through dense fairways, interacting with conventional vessels, needs to act in accordance with the Convention on the International Regulations for Preventing Collisions at Sea, 1972 (COLREGs). Data models are needed for systems to recognise situations and react to surrounding traffic.” 
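The kind of rule-based decision-making Rødseth and Medhaug describe can be sketched in a few lines. Everything here is hypothetical – the thresholds are invented and only loosely inspired by COLREG crossing rules – but it shows the key idea: a deterministic rule set with an explicit fallback to remote control when the situation is not predictable enough:

```python
# Hypothetical sketch of rule-based collision avoidance with a remote-control
# fallback. Not a real navigation system; thresholds are invented.

def decide(relative_bearing_deg, cpa_m, prediction_confidence):
    """Return an action for a single detected target vessel.

    relative_bearing_deg: bearing of the target relative to own bow (-180..180)
    cpa_m: predicted closest point of approach, in metres
    prediction_confidence: 0..1, how well the target's behaviour is predicted
    """
    if prediction_confidence < 0.6:
        return "request remote control"       # situation not predictable enough
    if cpa_m > 500:
        return "stand on"                     # no risk of collision
    if 0 < relative_bearing_deg < 112.5:      # target in the starboard sector
        return "give way: alter course to starboard"
    return "stand on (monitor)"               # give-way duty is on the other vessel
```

The crucial design choice is the first branch: unlike a trained model that always emits some answer, a rule-based system can recognise that its preconditions do not hold and hand control to a human.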

Beyond small ferries

“The autonomous small ferry is not the only use case we are considering for autonomous ship technology,” says Alsos. “We are also working on big car ferries that will be partially autonomous. The ferry will plot a course to the destination and follow the route automatically. The ferry will also regulate speed.

“We are looking at auto docking too, where the ferry docks without human intervention. For the time being, we are aiming for partial autonomy. The crew still needs to be present to monitor the systems and look out for other ships.”

Autonomous crossing is expected to save a lot of fuel, since an autonomous system can run the ferry more efficiently than almost any human operator. The other advantage is safety. The captain doesn’t have to control the handles, which frees up his or her attention. All the captain has to do is monitor the systems and look out for other ships.

Thanks to what Norway is doing now, some time in the future, ships will operate without crews. A day-shift captain will be able to go into work in the morning, operate one or more ships from a control room, and be home in time for dinner. When this becomes a reality, the captain will never again have to go down with the ship – and nor will anybody else.

ML-driven tech is the next breakthrough for advances in biology

Image Credit: kentoh/Shutterstock

Hear from CIOs, CTOs, and other C-level and senior execs on data and AI strategies at the Future of Work Summit this January 12, 2022. Learn more

This article was contributed by Luis Voloch, cofounder and chief technology officer at Immunai.

Digital biology is in the same stage (early, exciting, and transformative) of development as the internet was back in the 90s. At the time, the concept of IP addresses was new, and being “tech-savvy” meant you knew how to use the internet. Fast-forward three decades, and today we enjoy industrialized communication on the internet without having to know anything about how it works. The internet has a mature infrastructure that the entire world benefits from.

We need to bring similar industrialization to biology. Fully tapping into its potential will help us fight devastating diseases like cancer. A16z has rephrased its famous motto of “Software is eating the world” to “Biology is eating the world.” Biology is not just a science; it’s also becoming an engineering discipline. We are getting closer to being able to ‘program biology’ for diagnostic and treatment purposes.

Integrating advanced technology like machine learning into fields such as drug discovery will make it possible to accelerate the process of digitized biology. However, to get there, there are large challenges to overcome.

Digitized biology: Swimming in oceans of data

Not long ago, gigabytes of biological data were considered a lot; we expect the biological data generated over the coming years to be counted in exabytes. Working with data at these scales is a massive challenge. To meet it, the industry has to develop and adopt modern data management and processing practices.

The biotech industry does not yet have a mature culture of data management. Results of experiments are gathered and stored in different locations, in a variety of messy formats. This is a significant obstacle to preparing the data for machine learning training and doing analyses quickly. It can take months to prepare digitized data and biological datasets for analysis.

Advancing biological data management practices will also require standards for describing digitized biology and biological data, similar to our standards for communication protocols.

Indexing datasets in central data stores and following data management practices that have become mainstream in the software industry will make it much easier to prepare and use datasets at the scale we collectively need. For this to happen, biopharma companies will need C-suite support and widespread cultural and operational changes.

Welcome to the world of simulation

It can cost millions of dollars to run a single biological experiment. Costs of this magnitude make it prohibitive to run experiments at the scale we would need, for example, to bring true personalization to healthcare — from drug discovery to treatment planning. The only way to address this challenge is to use simulation (in-silico experiments) to augment biological experiments. This means that we need to integrate machine learning (ML) workflows into biological research as a top priority.

With the artificial intelligence industry booming and with the development of computer chips designed specifically for machine learning workloads, we will soon be able to run millions of in-silico experiments in a matter of days for the same cost that a single live experiment takes to run over a period of months.

Of course, simulated experiments suffer from a lack of fidelity relative to biological experiments. One way to overcome this is to run the most interesting in-silico results as in vitro or in vivo experiments. The results of those in vitro/in vivo experiments then become training data for future predictions, creating a feedback loop that increases accuracy and reduces experimental costs in the long run. Several academic groups and companies are already using such approaches and have reduced costs by 50 times.

This approach of using machine learning models to select experiments and to consistently feed experimental data to ML training should become an industry standard.
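That feedback loop is essentially active learning. The toy sketch below shows the pattern: a model proposes the candidate experiments it is least certain about, only those are run in the (expensive) wet lab, and the results are fed back into training. The model, lab oracle, and uncertainty measure here are stand-ins, not a real pipeline:

```python
# Toy active-learning loop: the model selects experiments, the "wet lab"
# labels them, and results feed back into training. All components are
# placeholders supplied by the caller.

def uncertainty(model, x):
    """Distance from a 0.5 decision boundary: lower means more uncertain."""
    return abs(model(x) - 0.5)

def active_learning_loop(candidates, run_wet_lab, train, rounds=3, batch=2):
    labelled = []
    model = lambda x: 0.5                      # uninformed initial model
    for _ in range(rounds):
        # pick the experiments the current model is least sure about
        batch_xs = sorted(candidates, key=lambda x: uncertainty(model, x))[:batch]
        for x in batch_xs:
            labelled.append((x, run_wet_lab(x)))   # expensive real experiment
            candidates.remove(x)
        model = train(labelled)                    # retrain on all results so far
    return model, labelled
```

The payoff is that the number of wet-lab runs is fixed by the budget (`rounds * batch`) rather than by the size of the candidate pool, which is where the cost reductions mentioned above come from.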

Masters of the universe

As Steve Jobs once famously said, “The people who are crazy enough to think they can change the world are the ones who do.”

The last two decades have brought epic technological advancements in genome sequencing, software development, and machine learning. All these advancements are immediately applicable to the field of biology. All of us have the chance to participate and to create products that can significantly improve conditions for humanity as a whole.

Biology needs more software engineers, more infrastructure engineers, and more machine learning engineers. Without their help, it will take decades to digitize biology. The main challenge is that biology as a domain is so complex that it intimidates people. In this sense, biology reminds me of computer science in the late 80s, when developers needed to know electrical engineering in order to develop software.

For anyone in the software industry, perhaps I can suggest a different way of viewing this complexity: think of the complexity of biology as an opportunity rather than an insurmountable challenge. Computing and software have become powerful enough to shift us into an entirely new gear of biological understanding. You are the first generation of programmers to have this opportunity. Grab it with both arms.

Bring your skills, your intelligence, and your expertise to biology. Help biologists to scale the capacity of technologies like CRISPR, single-cell genomics, immunology, and cell engineering. Help discover new treatments for cancer, Alzheimer’s, and so many other conditions against which we have been powerless for millennia. Until now.

Luis Voloch is cofounder and Chief Technology Officer at Immunai




‘Cyberpunk 2077’ next-gen upgrade will be free for PS4 and Xbox One owners

CD Projekt Red is “on track” to release the PlayStation 5 and Xbox Series X/S versions of Cyberpunk 2077 in the first quarter of 2022, the studio’s parent company announced on Monday. CDPR had initially planned to release the update in late 2021 until it announced a delay to early 2022 in October. 

CD Projekt also confirmed anyone who purchased the game on either PlayStation 4 or Xbox One will receive the next-gen update for free. Since the game is currently playable on the current-generation systems through backward compatibility, everyone who owns the game on a console will get the upgrade. If you don’t already have Cyberpunk 2077, you can buy it while it’s 50 percent off on the PlayStation and Microsoft stores before the updated version comes out next year.



Amazon Web Services unveils enhanced cloud vulnerability management

Amazon Web Services (AWS) today announced several new features for improving and automating the management of vulnerabilities on its platform, in response to evolving security requirements in the cloud.

Newly added capabilities for the Amazon Inspector service will meet the “critical need to detect and remediate at speed” in order to secure cloud workloads, according to a post on the AWS blog, authored by developer advocate Steve Roberts. The announcement came in connection with the AWS re:Invent conference, which began today.

In a second security announcement, AWS unveiled a new secrets detector feature for its Amazon CodeGuru Reviewer tool, aimed at automatically detecting secrets such as passwords and API keys that were inadvertently committed in source code.

The security updates from AWS come as enterprises continue their accelerated shift to the cloud, even as security teams have struggled to keep up. Gartner estimates 70% of workloads will be running in public cloud within three years, up from 40% today. But a recent survey of cloud engineering professionals found that 36% of organizations suffered a serious cloud security data leak or a breach in the past 12 months.

Changing cloud security needs

In the post about the Amazon Inspector updates, Roberts acknowledged that “vulnerability management for cloud customers has changed considerably” since the service first launched in 2015. Among the new requirements are “enabling frictionless deployment at scale, support for an expanded set of resource types needing assessment, and a critical need to detect and remediate at speed,” he said in the post.

Key updates for Amazon Inspector announced today include assessment scans that are continual and automated — taking the place of manual scans that occur only periodically — along with automated resource discovery.

“Tens of thousands of vulnerabilities exist, with new ones being discovered and made public on a regular basis. With this continually growing threat, manual assessment can lead to customers being unaware of an exposure and thus potentially vulnerable between assessments,” Roberts wrote in the post.

Using the updated Amazon Inspector will enable auto discovery and begin a continual assessment of a customer’s Elastic Compute Cloud (EC2) and Amazon Elastic Container Registry-based container workloads — ultimately evaluating the customer’s security posture “even as the underlying resources change,” he wrote.

More feature updates

AWS also announced a number of other new features for Amazon Inspector:

  • additional support for container-based workloads, with the ability to assess workloads on both EC2 and container infrastructure;
  • integration with AWS Organizations, enabling customers to use Amazon Inspector across all of their organization’s accounts;
  • elimination of the standalone Amazon Inspector scanning agent – assessment scanning is now performed by the AWS Systems Manager agent, so a separate agent does not need to be installed;
  • enhanced risk scoring and easier identification of the most critical vulnerabilities.

A “highly contextualized” risk score can now be generated through correlation of Common Vulnerability and Exposures (CVE) metadata with factors such as network accessibility, Roberts said.
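AWS has not published Inspector's actual scoring formula, but the general idea of contextualizing a CVE score can be sketched: start from a CVSS base score and adjust it for environmental factors such as network reachability. The weights below are invented purely for illustration:

```python
# Hypothetical sketch of a "contextualized" risk score. The adjustments and
# weights are invented for illustration; this is not Amazon Inspector's formula.

def contextual_risk(cvss_base, network_reachable, exploit_available):
    """Adjust a CVSS base score (0..10) for environmental context."""
    score = cvss_base
    if not network_reachable:
        score *= 0.5                         # unreachable from the network: lower urgency
    if exploit_available:
        score = min(10.0, score + 1.5)       # a known exploit raises the priority
    return round(score, 1)
```

The point of such a score is triage: two workloads with the same CVE can end up with very different priorities depending on whether the vulnerable service is actually exposed.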

Secrets detector

Meanwhile, with the new secrets detector feature in Amazon CodeGuru Reviewer, AWS addresses the issue of developers accidentally committing secrets to source code or configuration files, including passwords, API keys, SSH keys, and access tokens.

“As many other developers facing a strict deadline, I’ve often taken shortcuts when managing and consuming secrets in my code, using plaintext environment variables or hard-coding static secrets during local development, and then inadvertently commit them,” wrote Alex Casalboni, developer advocate at AWS, in a blog post announcing the updates for CodeGuru Reviewer. “Of course, I’ve always regretted it and wished there was an automated way to detect and secure these secrets across all my repositories.”

The new capability leverages machine learning to detect hardcoded secrets during a code review process, “ultimately helping you to ensure that all new code doesn’t contain hardcoded secrets before being merged and deployed,” Casalboni wrote.
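CodeGuru Reviewer's detector is ML-based, but a crude regex scan illustrates the category of problem it addresses. The patterns below are illustrative only; a real detector needs far more patterns plus entropy and context checks to keep false positives down:

```python
# Crude illustrative secrets scanner (not CodeGuru's ML approach): flag lines
# of source text that look like they contain hardcoded credentials.
import re

SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private_key":    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "generic_secret": re.compile(r"(?i)(password|api[_-]?key|token)\s*[:=]\s*['\"][^'\"]{8,}['\"]"),
}

def scan(text):
    """Return (line_number, pattern_name) for each suspected secret."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, name))
    return findings
```

Running a check like this in a pre-commit hook or code review, as the CodeGuru feature does automatically, catches secrets before they ever reach the repository history.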

AWS re:Invent 2021 takes place today through Friday, both in-person in Las Vegas and online.

