According to Oil & Gas UK’s latest Tech Insights report, data and digital solutions are finally becoming an integral part of the asset lifecycle. One of the largest areas where they’re applied is facilities management, but what about the design phase? Elaine Maslin takes a look at two approaches.
It’s an area ripe for innovation. According to Gregor Deans, Customer Success Manager at Norwegian digital company FutureOn, digitalization can cut engineering hours in field development by up to 70%. Much of that comes from shorter lead times in the planning and design phases, where reductions of 30-80% are possible, Deans told the Underwater Technology Conference (UTC), held in Bergen and online earlier this year.
But it’s not just about time to first oil. Dubai-based engineering and fabrication firm Lamprell and Swiss engineering simulation company Akselos have shown that the amount of material needed for an offshore wind jacket foundation can be reduced by 30%, using digital twin technology. Lamprell has been using the same technology to reduce weight in oil and gas foundations by 10%.
FutureOn was spun out of visualization technology firm Xvision in 2014. Its FieldTwin Design product, offered as software as a service (SaaS), is a platform that brings a wide range of data sets, from bathymetry and reservoir data to well trajectories and pipeline data, into one environment that engineering teams can use.
Lundin Energy has been using the service on early phase subsea engineering work to “quadruple efficiency” and improve the quality of engineering outcomes, says Tom Widerøe, Lundin’s chief engineer subsea, who was also a speaker at UTC.
He says that efficiency has largely been gained by separating data from applications, so that there’s a “single source of truth” for data, and then applying web-based APIs within the FieldTwin environment to run any workflows required.
For early-phase design, where Lundin builds a case book to examine different design cases, anywhere from 10-20 up to several hundred, this has helped ensure they’re all kept up to date when any inputs change, such as oil price and production profiles. “Until recently, this has been a very manual process, with little time for iterations,” he told UTC. “Similar data is residing within different applications and different companies. Time and costs associated with maintaining and updating this data are significant. However, new tools and new working methodologies are now disruptively changing the way we work with field developments.”
Looking back to 2018, a case would be developed based on data from the reservoir and drilling departments, a whiteboard drawing of a field layout would be created, and then there would be a process to see if anything could be done to optimize it, Widerøe says. Suppliers would also be brought in to support this work. SketchUp (3D design software) would then be used to create a field layout: where the wells are, how long the pipelines are, and so on. Then a manual process, in Excel, would begin to cost the project, alongside work on flow assurance, free-span calculations, rock dumping requirements and the like, done internally and externally in specialist simulation tools such as OLGA.
“It’s a manual process that takes a lot of time, and when there are more than 200 case books to look into, there’s little time for iterations. There’s similar data residing within different applications, data residing at contractors, quality is questionable, it’s difficult to keep track of changes, and you’re reliant on individuals. The data residing within applications at different companies has to be manually updated in each place,” he says.
Using a platform like FieldTwin (the development of which Lundin has supported), the data is separated from the applications to create a single data set, stored in the cloud, that all the involved platforms and digital tools can access through an API, with automated dataflow across applications and companies, and automated generation of a standardized case book with a full change history, says Widerøe.
“A significant advantage of that data being stored in the cloud and accessible via the API is that it allows integration of other programs, so I can easily do flow assurance, or my OrcaFlex calculations, through FieldAP (FieldTwin) applications.”
It’s all also visualized and contextualized with imported GIS, bathymetry, or license maps from the Norwegian Petroleum Directorate, giving the operator an overview of licenses, pipelines in the area, or any other nearby structures.
“If I change a wellhead location in FieldAP, then the pipeline geometry gets updated,” says Widerøe. Integrated applications, run in FieldAP, are also typically updated on the fly, while external applications, maybe free-span calculations, will download the latest pipeline geometry from the cloud data to perform calculations, and then a new case book is automatically generated, he says.
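The workflow Widerøe describes can be illustrated in miniature. The sketch below is a hypothetical stand-in, not FieldTwin’s actual API: a single shared store holds the field data, every consumer derives geometry from it on read, so moving a wellhead automatically changes what a downstream check (such as a free-span calculation) sees.

```python
import math

# Hypothetical in-memory stand-in for a shared cloud data set: every tool
# reads and writes field data here instead of keeping a private copy.
field_data = {
    "wellhead_A": {"x": 0.0, "y": 0.0},
    "manifold_1": {"x": 1200.0, "y": 500.0},
}

def pipeline_length(store, from_id, to_id):
    """Derive straight-line pipeline length (m) from the latest positions."""
    a, b = store[from_id], store[to_id]
    return math.hypot(b["x"] - a["x"], b["y"] - a["y"])

def move_wellhead(store, well_id, x, y):
    """Update the single source of truth; consumers recompute on next read."""
    store[well_id] = {"x": x, "y": y}

# A downstream tool always sees current geometry, with no manual re-entry:
before = pipeline_length(field_data, "wellhead_A", "manifold_1")
move_wellhead(field_data, "wellhead_A", 300.0, 400.0)
after = pipeline_length(field_data, "wellhead_A", "manifold_1")
```

Because the geometry is derived from the shared record rather than duplicated into each application, there is nothing to keep in sync manually, which is the point of separating data from applications.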
“We see we’re able to do double the number of cases, using half the time. And due to better transparency and standardization and increased communications, the quality of the work will increase significantly.”
Deans says some challenges are being worked on through joint industry projects (JIPs) and workgroups. The “data liberation front”, a term coined by Aker BP/Cognite in 2018 to encourage easier access to data, is one step, he says, but “data socialism” is also needed, to make collaboration easier.
And to do that, he says, everyone needs to speak a common language, which currently isn’t the case. Each contractor has its own metadata, and that increases the time and effort needed to integrate the different solutions, he says.
An initiative, called Capital Facilities Information Handover Specification (CFIHOS), under the International Association of Oil & Gas Producers (IOGP) JIP 36, is working to develop specifications to facilitate transfer of information between operators and suppliers, to make this easier.
A reference data library is also being developed to support standard equipment naming.
Another IOGP project, JIP 33, is developing standard equipment specifications and another, DISC, is focused on the adoption of information standards. The OSDU forum is also developing open-source data standard models and platforms, mostly around well and reservoir data to date, says Deans.
This year, Equinor and Total (now TotalEnergies) have been working together to align their internal metadata sets with external tools and contractors, he adds. The final agreed subsea metadata library will be added to the FieldTwin platform, and it’s expected that other operators’ metadata libraries will be similar. The goal is an industry-standard metadata library to submit to CFIHOS. That’s work in progress…
Akselos & Offshore Wind
Another approach to the digital twin is Akselos’ physics-based simulation technology. The technology, licensed from MIT, is a reduced basis finite element analysis (RB-FEA) software based on algorithms that the firm says are 1,000 times faster than those typically used in conventional FEA. Unlike FEA, RB-FEA can also provide detailed structural integrity simulations of entire assets, the firm says.
Akselos was founded about nine years ago and initially focused on integrity management through a software-as-a-service (SaaS) model, with venture capital funding and support from Shell.
But it’s also been looking at design engineering, helping to make fatigue calculations, across entire assets, faster and better.
John Bell, SVP at Akselos, says that FEA, the primary tool for testing strength in an asset, hasn’t changed for about 40 years and is always a compromise: if you want high fidelity, it can only be done at small scale, as FEA is generally not able to compute detailed models of full systems, and if you want a full system analyzed, it has to be done via lots of smaller models, leading to layers of design conservatism. It’s also slow, which is especially an issue for supporting operational decision-making.
Akselos’ RB-FEA is based on reduced order models: numerical methods that reduce the computation required to evaluate high-fidelity models such as FEA. This is complex stuff – Akselos’ RB-FEA algorithms, based on a component-based version of reduced basis methods, took MIT about 12 years to build, under funding from the US Department of Defense, which wanted better – faster – ways to understand the structural integrity of its ships in a live battle situation.
The parameterized components that are connected to form a model make the computation much faster than traditional FEA, because fewer numbers have to be crunched.
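The core idea of reduced basis methods can be shown in a toy example. The sketch below is a generic reduced-order-model demonstration, not Akselos’ proprietary component-based algorithm: a few “offline” training solves of a large system yield a small basis, and the “online” solve then happens in that tiny subspace. For illustration, the target load is included in the training set, so the reduced solve reproduces the full one almost exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "full" system: a large symmetric positive definite stiffness
# matrix K and a load vector f (a real FEA model would assemble these).
n = 400
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)
f = rng.standard_normal(n)

# Offline stage: precompute a reduced basis V from a handful of training
# solves (a real RB method samples parameters and certifies the error).
train_loads = rng.standard_normal((n, 8))
snapshots = np.linalg.solve(K, np.column_stack([f, train_loads]))
V, _ = np.linalg.qr(snapshots)          # 400 x 9 orthonormal basis

# Online stage: crunch a 9x9 system instead of the 400x400 one.
K_r = V.T @ K @ V
f_r = V.T @ f
u_reduced = V @ np.linalg.solve(K_r, f_r)

u_full = np.linalg.solve(K, f)
rel_err = np.linalg.norm(u_reduced - u_full) / np.linalg.norm(u_full)
```

The online solve touches only the reduced matrices, which is why, once the offline work is done, each new evaluation is orders of magnitude cheaper than re-solving the full model.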
“Pre-analysis, and a component-based approach, allows us to significantly reduce the amount of data we need to calculate,” says Michiel van Haersma Buma, VP for Customer Success at Akselos, “accelerating each solve by orders of magnitude.”
That makes it faster and also scalable to well over 100 million degrees of freedom, he says, adding that “there is really no other technology capable of handling those levels of complexity, so we genuinely believe this will change the game in both design and operations.”
Any changes to the components – material properties, loads, geometries – can be quickly resolved in the model, making design iterations faster and across an entire asset, such as a turbine, not just the blade.
UAE-based Lamprell has seen this as an opportunity. The company has been working in offshore wind for some time and currently has a contract for 30 offshore wind turbine jackets with suction bucket foundations on SSE’s Seagreen 1 project off Scotland, each of which will host 10MW MHI Vestas turbines.
Lamprell COO Hani El Kurd says the company started on its digitalization journey by digitalizing existing processes, automating things like welding in its yards and using robotics.
It then started looking at design optimization, as part of efforts to help reduce inspection requirements. The time it took to run simulations was a limiting factor, he told a webinar run by Akselos, restricting how many iterations could be run to change and influence concept designs. RB-FEA has overcome that challenge, and they can also look more closely at constructability. Traditionally, a design for an offshore wind jacket will take nine to 18 months, but with RB-FEA that could be radically reduced, to half that time or even less, says Van Haersma Buma.
Sabih Laham, VP of engineering, says the previous methodology was siloed, with different analyses run at different levels of the design and full model revalidation requirements, resulting in a “laborious” process and uncertainty. In comparison, RB-FEA allows an unrestricted number of degrees of freedom to be simulated in a single model at speed, without needing large amounts of computing power. So far, Lamprell has reduced the weight of an oil and gas foundation by 10% and the steel input for offshore wind turbine jackets by up to 30%, the firm says.
Akselos itself is also now working with Shell and RWE to model the Stiesdal-designed TetraSpar floating offshore wind demonstrator, which was recently installed offshore Norway. In this case, the structure model will include live data from the installed structure, so it can be analyzed in near real-time to support future design improvements and optimization.
This work follows a project on Principle Power’s Windfloat Atlantic project.
There is more that could be done from an industry perspective, if the data were available. However, it can be hard to get, for example, the required turbine data, which means a design has to ping back and forth between the foundation designer and the turbine manufacturer, lengthening the process, says Van Haersma Buma, who joined Akselos in 2021 after supporting the company through Shell Ventures.
“If we had access to detailed turbine data earlier in the offshore wind design cycle, we could start the foundation optimization process early and reduce project uncertainty for an offshore wind developer by an order of magnitude,” he says.
This post appeared first on Offshore Engineer News.