The Houston Professional Petroleum Data Expo was held last week, and attendees were clearly excited to interact with each other; for many, this was their first in-person event in nearly two years. There were 40 presentations, one of which was the keynote and three of which were case studies presented by operators. So, while only around 10% of the talks came from operators, the content was generally good, with a few standouts as expected. The general theme, regardless of the speaker, was consistent: data management, quality data, solid processes, and buy-in from all business stakeholders remain key to the success of the business.
With the continued explosion in Data Science, Digital Transformation initiatives, and data engineering projects, combined with reduced headcounts, lower operating budgets, and higher-than-ever demand for data, energy faces a conundrum. The number of people who have left the industry completely in the past year, not to mention over the tumultuous years since mid-2014, is overwhelming. While operators have been hesitant to eliminate geoscience professionals in their key operating areas, there have been overwhelming reductions in the support functions those geoscientists rely on. Much of the historical knowledge now resides in a very small pool of internal resources inside the operators, and this lack of SMEs, just as operators return to their prior levels of operations, is driving up demand for data management services as well as for transformative initiatives that let them continue operating lean and mean into the future.
Operators are now looking at end-of-year funds as well as their budgets for 2022, and at ways to clear the hurdles before them. Many are looking to software solutions, which were well represented at this event in terms of sponsors, breakroom vendor booths, and vendor presentations. Some are relying on data experts, several of whom were also present as attendees and presenters alike. One thing that felt truly promising was the number of clients (and future clients) who were excited to see Sword with a presence in the US. While Sword has been very active in PPDM in both Europe and Australia, Sword's move to start operations in Houston late last year was also a welcome sight for the data management folks in attendance. Clearly, the need is high, and we look forward to facing it head-on as we continue to show the industry what Sword is all about in this space.
Two of the operator presentations, one from Apache and another from Murphy, exemplified both the needs and the shortcomings. Both operators have gone through major organizational shifts over the past year, both see the need for good data practices (governance, management, stewardship, business alignment, etc.), and both realize that while they have made incredible strides, the road in front of them remains long and bumpy. Chevron also spoke about their journey but focused on the "What Is A Well" (WIAW) initiatives they have tackled. WIAW has been one of the most successful educational tools in the industry over the past several years, and the work PPDM has done on it, along with its committee members, data vendors, software vendors, and operators, has been amazing. Around 2008, hydraulic fracturing became the majority of onshore operations in the US, and with it came an entirely new focus on the well, the wellbore, and the process of drilling, completing, and producing shale wells. WIAW brought that entire process into the light for everyone to understand and work toward agreeing on, for use in their internal systems as well as at conferences such as this one. A common language understood by all makes communicating business problems, and solving them, so much simpler.
It's hard to look back with anything less than amazement at where Sword was in Houston one year ago. While over 2,000 people work inside Sword Group today, hundreds of them in the Energy industry, there were exactly zero Sword staff in Houston at this time in 2020. As we close out 2021 with this PPDM event, we had 4 in attendance, 3 presenting, and 15 total staff in Houston. As we continue to grow in the region, we expect to hear from more and more clients as we head into 2022 with a successful PPDM event behind us and a doubling of our Houston presence in front of us. What an exciting time for Data Management in Houston!
Thanks, PPDM, for a successful event and for allowing Sword to play a role in it.
Rob Gibson is a Data Specialist and Technical Solutions Manager for Sword, based in Houston. With over 25 years of technical experience in Oil & Gas, Rob works with our customers in building solutions to their data challenges. Rob has been a long-time member of PPDM including both sponsoring and participating in a number of workgroups over those years. For further information or to contact directly, please email Rob.Gibson@sword-group.com
Sword sponsored last week’s PPDM European Data Bytes Session, featuring guest speaker Neale Stidolph (Sword’s Head of Information Management) talking about some examples of Sword’s work in data management. The event was well attended and set a new record of participants from 15 countries, sparking debate and discussion. Dan Clarke also attended and gives his thoughts on the session.
The event kicked off with speakers Jonathan Smith and Paul Smith from Interica and Petrosys, presenting their collaborative tool to link structured data to unstructured data. It was an interesting presentation where we learnt about the tool’s ability to link multiple silos, systems and technologies together to present a clear view of the data within these. The following discussion centred on handling legacy data and the OCR of the poorer quality scans. This is a significant challenge for our industry which will need to be solved if we are to tap into the information held within the legacy data sitting in the NDR, hardcopy storage and on file shares. Our ability to support MER and decommissioning is dependent on solutions in this space.
Neale Stidolph, Sword’s guest speaker, discussed the diversity of data management work, from subsurface to engineering and corporate data. As the energy sector changes, there have been numerous mergers and acquisitions, as well as the inclusion of renewables through the energy transition. This has amplified the need to handle very large amounts of data, migrate it, and deal with new data types – all with the necessary data governance.
The final presentation was from Tanya Knowles of the OGA, who spoke about GIS and the various data portals on the OGA’s website. As part of the Digital Services team, Tanya looks to present the pipeline of data to the OGA and external users. The OGA has various data portals for a variety of different data types, including licence rounds and production information. The OGA promote the use of ML and AI and intend to incorporate these into their future apps and portals, which will include offshore activity, open data and APIs. It is great to see the OGA making data available through these portals for the industry to access and use, and we’re proud to be supporting them on this journey.
The PPDM regularly holds its Data Bytes forum, where data managers come together to facilitate collaborative idea sharing, discussion, and networking. The PPDM meet again on 10th June for a special celebration in honour of the Association's 30th Anniversary.
Dan Clarke is a Data Specialist and Business Development Manager for Sword, based in Aberdeen. With over 15 years’ technical experience in Oil & Gas, Daniel works with our customers in building solutions to their data challenges. Daniel joined the PPDM UK Leadership Team in 2018 and has supported a number of events in Scotland and elsewhere. For further information or to contact directly, please email Daniel.firstname.lastname@example.org.
In this deep-dive article from one of our Principal Data Engineers, Stephen Connell, we get under the skin of how to make the most of data connections in Microsoft Azure Synapse Analytics Workspace. Stephen uses screenshots to explain his approach and outline why minimising the number of integration datasets is important when developing data solutions.
In a Synapse Analytics workspace, connection to data is made using two components: a Linked Service, which acts much like a connection string for a data store, and one or more Integration Datasets, which connect to data within the Linked Service as named views of that data.
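Under the hood, Synapse stores both components as JSON definitions. The sketch below approximates that structure using Python dictionaries for readability; the service and dataset names, server, and database details are hypothetical, not taken from a real workspace.

```python
# Hypothetical Linked Service: behaves like a connection string for one
# specific database on one specific server.
linked_service = {
    "name": "LS_OnPremSqlServer",  # hypothetical name
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            # Server and a specific database are fixed at this level
            "connectionString": (
                "Integrated Security=True;"
                "Data Source=MyServer;"
                "Initial Catalog=AdventureWorks2017"
            )
        },
    },
}

# Hypothetical Integration Dataset: a named view of one table exposed
# through the Linked Service above.
integration_dataset = {
    "name": "DS_SalesOrderHeader",  # hypothetical name
    "properties": {
        "linkedServiceName": {
            "referenceName": "LS_OnPremSqlServer",
            "type": "LinkedServiceReference",
        },
        "type": "SqlServerTable",
        "typeProperties": {"schema": "Sales", "table": "SalesOrderHeader"},
    },
}
```

Note how the dataset binds to a single schema and table; each additional table you need to read would, in the naive pattern, mean another dataset like this one.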
Like most things in Azure Synapse Analytics workspaces, creating Integration Datasets is incredibly easy, intuitive, and quick. They can be created on the fly while you are building pipelines or Mapping Data Flows. This ease and speed of creation can add complexity to your workspace environment, and what seemed like a fast fix can end up slowing down development and making your solution less manageable.
In this blog, I explore why minimising the number of Integration Datasets is a good idea and what patterns we can adopt to minimise their number.
Figure 1 A Linked Service to an on-premises SQL Server
Linked Services connect to a specific store, for example a specific database on a server. In the case of SQL Server, a specific database must be provided as part of the Linked Service. Each Integration Dataset built on that Linked Service then points to a specific table or view within that store. This can lead to a scenario where every table and view being accessed needs its own Integration Dataset.
Consider a database of the complexity of AdventureWorks2017, with 80+ tables and a couple of dozen views. That could require over a hundred different Integration Datasets, leading to a high degree of complexity for data engineers in Synapse.
Figure 2 Selecting the Dataset for a copy task.
In this example, I have 19 Datasets. This can make finding the correct one to use difficult, as the filter options available only go so far.
Figure 3 Filter on Source Dataset
When considering how we might simplify the approach to Integration Datasets, it helps to consider how these are used within Synapse Analytics and how we might be able to utilise the functionality to our advantage.
Integration Datasets are used in Pipelines and Data Flows. In both uses, there are capabilities that allow us to modify the specified source or destination for our data. These will change depending upon the nature of the data store. I will consider two approaches: an RDBMS – in this example, SQL Server – and Data Lake Storage. In the latter case, there is a complication: the Integration Dataset stores not only the location but also the type of document that will be created. This is stored within the code of the connection, e.g. "type": "DelimitedText" for CSV files or "type": "Parquet" for Parquet files.
Connecting to source data within Azure Storage, e.g. the default Azure Data Lake Storage Gen2 named [WorkspaceName] – WorkspaceDefaultStorage, allows Integration Datasets to point to specific folders and files while also permitting the use of wildcards and overrides. Let us consider a Parquet-based Integration Dataset that points to high-level storage.
Figure 4 Parquet Connection with only Container
In the example shown, this Integration Dataset has no Directory or File specified. When this is used in a Copy Data Activity in a Data Factory Pipeline, the activity source settings allow you to specify the location to load data from, by using Wildcard details.
Figure 5 Source Wildcard file path details
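The pattern just described can be sketched as below, again using Python dictionaries that mirror the JSON Synapse generates. The container name, folder pattern, and dataset name are all hypothetical; the point is that the dataset fixes only the container and document type, while the Copy Activity source narrows things down with wildcards.

```python
# Hypothetical container-level Parquet dataset: no Directory or File set.
parquet_dataset = {
    "name": "DS_ParquetGeneric",  # hypothetical name
    "properties": {
        "type": "Parquet",  # the document type lives in the dataset itself
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "raw",  # container only; no folder or file
            }
        },
    },
}

# Copy Activity source settings that supply the location via wildcards,
# so one dataset can serve many folders and files.
copy_source = {
    "type": "ParquetSource",
    "storeSettings": {
        "type": "AzureBlobFSReadSettings",
        "wildcardFolderPath": "sales/2021/*",  # hypothetical path
        "wildcardFileName": "*.parquet",
    },
}
```

Because the wildcard lives in the activity rather than the dataset, the same `DS_ParquetGeneric` dataset can be reused by every pipeline that reads Parquet from this container.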
When it is used as a Data Flow source, Wildcard paths can similarly be used to specify the location.
Figure 6 Wildcard Paths in Data Flow Source Transformation
For Sink activity settings or Sink Transformations, the scope for Wildcard paths is not quite as straightforward. In both Copy Data Activities and Data Flow Sinks, the destination can be set in the pipeline, but only where a parameter is used. Consider an example: we create a Parquet Integration Dataset, but in this instance, instead of relying upon wildcards, we add a parameter; see below.
Figure 7 A Parquet dataset with a Parameter
Figure 8 Using a Parameter for a File path
We can use this parameter to form part of the path of the file locations, and when this parameter is present within the Integration Dataset, it will be presented for inclusion whenever the dataset is used in Copy Data Activities or Mapping Data Flows.
Figure 9 Copy Parameterized Sink
Figure 10 Data Flow Parameterized Sink
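A parameterised dataset of this kind might look like the following sketch (Python dictionaries approximating the Synapse JSON; the dataset name, container, parameter name `FilePath`, and path value are all hypothetical). The `@dataset().FilePath` expression is how the dataset's own definition refers to its parameter.

```python
# Hypothetical Parquet dataset with a FilePath parameter forming part of
# the folder path, instead of a wildcard.
parameterised_dataset = {
    "name": "DS_ParquetParameterised",  # hypothetical name
    "properties": {
        "type": "Parquet",
        "parameters": {"FilePath": {"type": "string"}},
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "curated",  # hypothetical container
                # The parameter supplies part of the path at run time
                "folderPath": {
                    "value": "@dataset().FilePath",
                    "type": "Expression",
                },
            }
        },
    },
}

# When the dataset is used as a sink, the activity supplies the value:
copy_sink_reference = {
    "referenceName": "DS_ParquetParameterised",
    "type": "DatasetReference",
    "parameters": {"FilePath": "sales/2021/12"},  # hypothetical path
}
```

The same dataset can therefore write to any folder in the container, with each pipeline supplying its own `FilePath` value.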
The same would be true of sources used in this manner. Supplying a default value is allowed, and there are pros and cons to doing so. It means that when using this approach for a source and utilising Wildcards, there is no need to populate the parameter; however, it also means the parameter is always populated, making it possible to miss occasions when it has not been changed to point to the correct location.
Database datasets offer other options. The first thing to note is that the Wildcard Path option is not present; however, this does not rule out the ability to parameterise our Integration Datasets. Indeed, there is more scope to do so: we can separately specify values for the schema and the table name. This approach is available in both Pipelines and Data Flows; however, it is important to note that on-premises Datasets using Self-Hosted Integration Runtime connections are not available in Data Flows.
Figure 11 RDBMS Parameterized Dataset
In addition to using parameters, it is possible with database connections to supply a query or the name of a Stored Procedure. In the case of supplying a query, it is possible in some cases to read data from another database contained within the same server!
Figure 12 AdventureWorksDW query on an AdventureWorks Connected Dataset
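The cross-database trick can be sketched as below: a Copy Activity source supplies its own query, and because that query runs over the Linked Service's connection, a three-part name can (permissions allowing) reach a sibling database on the same server. The specific database, table, and column names here are illustrative only.

```python
# Hypothetical Copy Activity source: the dataset's table is ignored and a
# query is supplied instead. The three-part name reads from a different
# database on the same server as the Linked Service's connection.
copy_source_with_query = {
    "type": "SqlServerSource",
    "sqlReaderQuery": (
        "SELECT d.FullDateAlternateKey, d.CalendarYear "
        "FROM AdventureWorksDW2017.dbo.DimDate AS d "
        "WHERE d.CalendarYear >= 2017"
    ),
}
```

This further weakens the case for a dataset per table: one database dataset plus activity-level queries can cover many read scenarios.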
When using Stored Procedures, you can supply parameter values within Copy Data Activities and Data Flows. See Microsoft's documentation for more details.
Because we have different ways to specify the inbound and outbound locations of our data, the need to include a new Integration Dataset for each table and each storage folder becomes weaker. Given these considerations, I would recommend a pattern in which you:
Consider the data you are processing and how you will be working with your data stores. If you are only reading from on-premises SQL Server, for example, you might need just one dataset, i.e. a non-parameterised connection.
Work with your team. Make sure that you are on the same page as each other and that no unnecessary connections are created. Reducing the number of Integration Datasets will simplify your design and make sure you are connecting to the right place more often.
Stephen Connell is a Principal Data Engineer at Sword based in Glasgow. Stephen has over 20 years’ experience developing solutions for Data and BI Reporting for a range of clients in business, government and the third sector. Sword specialises in data, IT and digital support to several industries including Energy. For further information or to get in contact, please email email@example.com or call directly on +44 (0)131 300 0709.
That’s a key question in oil and gas right now and it brought 30 of us from different backgrounds together in a joint PPDM and WADSIH meeting to answer this in one of the many DataScienceWeek events unfolding across Perth.
Firstly, you’ve got to narrow the scope of this question if you want to reach any kind of meaningful answer in a 2.5-hour session. With an audience covering stalwarts through to start-ups, we took analytics as the test case for discussion. But wait, what do we mean by this? ‘Analytics’ covers a range from the descriptive ‘what happened’ (a yes vote from the data managers in the room), through predictive ‘when will it happen next’ (yes from the data analysts) to cognitive ‘what can we learn from it’ (the data scientists).
With an idea of how we plan to use these data, we addressed quality dimensions using parameters that will be familiar to many. Is our data complete, unique, timely, valid, accurate and consistent? Which of these matters? Long story short… it’s whatever you need to use data confidently to support the business challenge you need to resolve. Which is where the metadata discussion kicked off: what’s the data description you need to understand your data and how do open schemas such as Dublin Core compare to the application-native options we typically use? Many of us in the room had experience of forcing data into various vendor models, then reloading these same data from source as we cycled through products from different vendors – which I think provides the answer! Resolving this is part of the premise behind OSDU which has the potential to be a step-change for how we work with data in our industry.
We took a quick sidestep into what to do when metadata is not available to support the data you want to use. Risk it and gain insight at lower confidence, or play it safe and maybe miss the extra benefit? Again, it depends on what you want to do – nicely demonstrated against an O&G value chain starting at Exploration (low quality data, high quality data? Gimmie it all!) and finishing with Decommissioning (how do we make this safe AND convince others that is what we’ve done?). All reasonable, but then how do you manage the chain of custody to ensure that data is not used for purposes for which it is not fit? Appropriate metadata is the answer.
Break-out sessions tested these quality dimensions across scenarios including legacy and new G&G data types, healthcare and safety case documentation. Common to all of these was the need to balance an upfront investment in time and effort against uncertain future use cases for the data, with the party realising the benefit not always the same one taking the initial cost hit. Lessons from outside of O&G were particularly acute with the cost of acquiring one data type being a small proportion of the costs of subsequent storage – which is the opposite for our expensive well and seismic data.
With the relative knowns of quality and metadata addressed, we moved onto the unknown biases that hide in data and interpretations, ready to catch you out the moment you forget about them. In an industry that looks for the few formations and structures containing hydrocarbons, we love to eliminate outliers through the use of averages, means and medians. With limited availability of data, training datasets and sampling are not always representative. (Hint: lobby your regulator to open up their data store if they haven’t already done so). Data that fits our interpretation is always ‘right’ and exceptions get excluded… What’s the lesson here? As a data manager, I need to accurately and precisely deliver the data as it was acquired; as a data scientist I need to challenge the data that is presented to me.
A lot of the workshop content was familiar but that’s no criticism – it’s healthy to revise and continually test our views as the business challenges we’re supporting and the technology available to help us are evolving. Jess Kozman did a great job facilitating and there was active participation across the board. However, the number of attendees was on the low side – lingering covid-caution or perhaps signalling the reduced numbers of data professionals still working in oil and gas, and a concern from those with jobs to be taking time out from their role to learn and grow? This must be addressed by an industry with an ever-increasing focus on data to help it respond to external pressures. Improved recovery factors, lower lifting costs and exploration efforts focused on established basins mean we need to access, understand and work our existing and new data extensively. Data professionals and the support we provide have a valuable ‘multiplier’ effect on the efforts of the consumers that we support, whether they are geoscientists, engineers, analysts or algorithms.
Back to the original question – is our data fit for purpose? The only way we can answer that is by inverting the question: to what purpose does our data need to be fit? This requires continued engagement with business and technical consumers of data; awareness, adoption and use of technology; and on-going assessment and extension of our value proposition. We can no longer predict all the ways in which our oil and gas data will be used, but that’s no excuse to not try!
Thanks to Jess Kozman for facilitating, the PPDM West Australia leadership team volunteers for arranging and to CORE Innovation Hub for providing a great workshop facility.
About the author
Neil Constantine is a Business Unit Director and Data specialist at Sword, based in Perth. Neil has 25 years’ experience, working in both operator and service environments. Sword specialises in data, IT and digital support to a number of industries including Energy. For further information or to get in contact, please email firstname.lastname@example.org or call directly on +61 (0)426 240702.
Asset Transfers can be complex from an information and technology perspective but can be the springboard for innovation, as well as delivering the expected productivity. The critical path is for the transfer of services agreement, safety case, readiness, and transition of operations. This article highlights the types of transfer, the issues we face, and Sword’s tried and tested approach to dealing with them.
The divestor may not have detailed plans of what to transfer or how to do it, and the acquiring organisation may have limited experience of transfers. The team at each end may be undertaking this work on top of their current jobs, with timescales that are often outside of their control. This creates demands that are stressful for key personnel and impacts their ability to quickly make critical decisions.
The impact on contracts and procurement is often underestimated. Limited time for tenders and approvals for expenditure, as well as a lack of expertise in the contract scope and novation, all need an appropriate strategy. Sword helps drive that efficiently as we coordinate a range of vendors, use specific templates from our procurement catalogue, understand the technical scopes, and can validate pricing. We understand the importance of the early identification of contracts that cannot transfer, or custom and out-of-date systems that will need replacing. Watch out for surprisingly long lead-time items such as radio licenses, network circuits, software customisation, and offshore deployment.
The UKCS landscape is evolving. Supermajors are changing their portfolios and mid-tier organisations seek opportunities, but the most dynamic area is new entrants. Rising prices are increasing investor appetite, but this is likely to be balanced by the Energy Transition and ESG pressures.
The approaches to an asset transfer take account of the practical differences in systems maturity and capability between companies of varying scale. The challenges faced reflect the different types of transfer; from a non-operated asset, unmanned platform or subsea production system, to a major platform or FPSO. The timescales, and corresponding project and capex costs, can range from one or two months at one end of the scale, to six months or a year at the other.
The primary objective is always to hit the transfer deadline. That creates the mindset of “move everything over ‘as is’, start using it, and then worry about the details later”. Transfer projects can involve dozens of systems and huge amounts of information, paper, tapes, samples, and other data. There is a clear need to maintain the Chain of Custody to govern how information changes hands. Not all systems are appropriate for Cloud technology, and that certainly won’t help with physical archives and samples. On-site systems take longer to deploy, and private datacentres can have short-term capacity limitations.
Don’t try to implement radical innovations unless you have proven methods and funding already in place. Post-transfer project fatigue can set in, creating a risk that the performance the new owner seeks will not be realised. Leadership and investors will expect increased performance, so plan for it. Make sure corporate information, engineering and subsurface are all taken care of. This ensures you can apply your people and financial resources to make future modifications to your production assets and make the most of your reserves. Sword’s approach consists of three separate phases.
Sword has successfully completed many asset transfers and related services on behalf of a range of UKCS companies including NEO Energy and Serica Energy alongside the best-known Operators in the industry.
Contact Neale Stidolph (Sword Energy Sector, Information Management Lead), email@example.com
Sword IT Solutions has embarked on an exciting new chapter with the creation of the Sword Energy Sector. This approach underlines the commitment to placing the needs of energy industry customers at the heart of the organisation, by bringing together Sword’s full breadth of domain experience, digital enablement, and managed service excellence. Building an Energy Sector is a natural next step for this business technology and data company, which has doubled in size over the last 5 years and has the ambition to double again by 2025.
Despite the past 12 months being recognised as the most challenging in recent memory, Sword has created 50 new roles in Energy while protecting existing jobs. Sword’s goal is to create at least 250 more positions within the Energy Sector in the next 3 years with continued investment in new talent, training, and skills development.
Sword’s Energy Sector has over 350 employees in locations across the UK, Europe, Asia Pacific, and North America. This global structure enables Sword to support customers of all sizes, maturity, and location, ranging from international blue-chip companies to local independents. Sword places critical importance on partnering with customers as ‘Trusted Advisors’ and applying an in-depth understanding of business needs. In doing so, Sword draws on extensive consulting and technical expertise to create a platform for future operations and innovation by delivering Digital, Data and Infrastructure Services.
Phil Brading, Sword Energy Sector Director, explains:
“We are shaping our business to make sure we are best placed to support the evolving needs of the energy industry. It’s important that we integrate our data and technology expertise from across the asset lifecycle to support our customers’ digital strategies and drive greater value. Our success is built upon listening to and working with our customers, and that ethos remains as fundamental as ever.”
Sword is delighted to announce the appointment of Helen Ratcliffe in the role of Technology Consulting Lead for the Energy Sector. Helen will be helping to build the next phase of Sword’s digital technology strategy to best meet customer demand for the next generation of specialist advice, projects and services.
Helen has over 30 years of experience in the Energy industry, predominantly in Upstream Oil & Gas. She has a great track record of delivering technology-enabled business transformation programmes spanning Mobility, Big Data, Digital Oilfield, Integrated Operations, and Organisational Change Management.
Helen Ratcliffe, Technology Consulting Lead, said:
“I am excited to join Sword as we embark on the 2025 strategy. Each individual part of the Sword Group has been successful in its own right, but I believe the Energy Sector will be greater than the sum of its parts. I am keen to work with our existing clients and bring in new ones with whom we can partner to drive positive business outcomes for each.”
The focus from Phil, Helen, and the wider Sword leadership team, will be to expand upon Sword’s foundations to develop flexible new services that address the challenges the industry is facing. This means applying a data-driven approach that promotes the adoption of best practice, efficiency and new insight.
Sword delivers practical business support to our customers in Energy. Spanning Oil & Gas, the Energy Transition, and Renewables, we provide day to day operational support, digital solution deployment, transition projects, and technical consultancy. Our expertise runs from IT, business technology, and infrastructure, to subsurface, production, engineering and corporate data and information.
Sword IT Solutions is part of the international Sword Group, which, with over 2,000 domain experts, is active in more than 50 countries and specialises in software and services across a range of industries.
Today we’re shining the spotlight on our recently appointed Sector Technology Consulting Lead, Helen Ratcliffe, who forms an integral part of our Energy Sector leadership team.
Tell us a bit about yourself and your journey before joining Sword?
I have been really privileged in my career to work with some of the world’s largest Energy companies, IOCs, NOCs and some clients in the supply chain as well. I have experience in Pharmaceuticals and have provided services to a very large, busy airport. My employers have ranged from large technology-focused systems integrators to boutique consultancies. For the past 20 years I have focussed on Consulting; this has given me the opportunity to see many different organisations from the inside, helping me gain a wide view of their businesses. I have put my effort into a range of technical and business-focussed initiatives, always with business outcomes for the client at the forefront.
Looking forward to the rest of this year, what big issues do you expect to help our customers address?
Digitalisation is on everyone’s mind; some organisations are advanced and have a great track record already, while others are just embarking on the journey. Bringing together strong consultative skills, the fundamentals of data and information, and deep domain experience, I think we can help move the oil and gas industry forward.
Why did you choose to progress your career with Sword?
Sword have a great set of capabilities, skills and a strong track record over many years in Oil & Gas, as well as Utilities and the Public Sector. Why wouldn’t I want to work with a company that couples that with a great growth record and values that I really believe in? Sword really care about clients’ businesses, ensuring that our highly valued employees all contribute towards keeping things simple and doing the right thing by our customers.
You have joined Sword at an exciting time for the business, what services and specialties do Sword provide in your region to energy customers?
Sword delivers practical business support to our customers in Energy. Spanning Oil & Gas, the Energy Transition, and Renewables, we provide day to day operational support, digital solution deployment, transition projects, and technical consultancy. Our expertise runs from IT, business technology, and infrastructure, to subsurface, production, engineering and corporate data and information.
On International Women’s Day, what is the most important message you want to send out to women thinking about their careers in the Energy Sector?
Over the past 35 years I have seen the Sector move, slowly at first and more rapidly over recent years, to address diversity. The Energy sector has re-invented itself; it is no longer all about the heavy industry aspects often portrayed in media and film. While this aspect is still a reality for key parts of the industry, science, technology, innovation, and business acumen have come into public focus. The Energy sector needs new talent, and women can play a major part.
What does the International Women’s Day 2021 slogan, #ChooseToChallenge mean for you in your work life?
Diversity of thinking in any team is very desirable. Don’t be afraid to share your thinking: sometimes people will agree with you, sometimes they will not. If you don’t give them the chance, how will you know?
Find out more about International Women’s Day 2021.
Sword is working with Serica Energy and the Datum360 Connected Data Software platform, using their combined expertise and technology to increase the integrity of engineering information for the Bruce asset, one of Serica’s UK North Sea assets.
On January 27th, Sword, Serica Energy and Datum360 delivered a 30-minute webinar, demonstrating how engineering and asset data can be leveraged and connected, day-to-day on a brownfield asset.
Find out more and watch the full 30-minute webinar by registering below.
When Serica Energy acquired the Bruce, Keith and Rhum assets in 2018, Sword was engaged to build the IT services they needed as a new North Sea operator. Andrew James led the IT transition team and subsequently took on the role of Serica’s Information Services Manager, responsible for Sword’s small but perfectly formed managed services team. In this article, Andrew explains what’s happened since 2018, as Serica enters its third year as operator of BKR, and how Sword is helping the company on its digital journey.
During Serica’s acquisition of Bruce, Keith and Rhum, IT services were built in readiness for cutover with the focus on satisfying essential business needs. It was important that core services and business systems were available on “Day 1” to support key business functions and ensure operational continuity. The principle of “Transition Now, Transformation Later” was observed during this phase.
With core IT systems in place following cutover, we entered a stabilisation phase when we made sure everything was working correctly, applied small fixes and enhancements, and introduced some new systems to improve ways of working. A lot of this work was opportunistic or reactive, based on short term or tactical business needs.
As we approach the end of the second year of IT operations, we are entering a more strategic, transformative phase when we will make more structural changes to how the company maximises value from digital solutions. In doing so, we have two main aims:
To take us forward, our digital strategy and roadmap will define improvements to be made on a rolling 3-year basis. Recognising that a digital programme is not just about introducing new IT systems, this will embrace four elements:
Digital is all about bringing these different elements together, working out how to use information, applications and technology to enhance business processes and achieve better performance. To do this consistently across the organisation and make sure we don’t miss opportunities, it will be important for all of Serica’s business teams to be involved to some extent. With this in mind, we have formed a Digital Steering Group including representatives from the IT function and all business disciplines.
The steps we have taken to date have allowed us to frame opportunities, explore benefits, agree our priorities, and deliver some quick wins. First, we recognised that the data which is available in many systems can provide a good insight into business performance and help us to identify improvement opportunities. We kicked off a project with three aims:
As the data hub evolves, we will extend this to enable more advanced analytics capabilities. It has already proven extremely valuable, and our dashboards based on Power BI are used by business teams on a daily basis.
Another theme we are pursuing is to put mobile technology into the hands of mobile workers, allowing them to access the information and digital tools they need without being tied to the office. We’re beginning with a new tool to improve how we perform inspections of electrical equipment offshore – an important and resource-intensive activity – and we will move on to other use cases in due course where we see opportunities to add value or work more efficiently, streamlining how we work across many areas through better collaboration and more direct access to tools and data.
Since data is so fundamental to digital solutions, we are also focusing on making sure enterprise data is well managed, with strong data governance controls defining where master data resides, how it is updated, how it is used, and how it can be accessed. This will allow us to integrate key systems to a much greater extent – maintaining data quality and consistency across business “silos” – and develop digital workflows which are driven by processes and data. An early priority has been improving our engineering data warehouse, allowing us to feed trustworthy data from there to other systems used for transactional business activities.
Although it is still early days, we have already seen potential benefits of enterprise data governance with our adoption of GDi Vision, an asset visualisation solution. The core product allows us to perform virtual walkthroughs of the offshore asset based on geospatial and photographic data. We are now looking at augmenting this with data from our engineering and maintenance systems, allowing engineers to plan work based on a digital twin, reducing the need for costly and time-consuming offshore surveys.
There are many exciting ideas and demands across the business, and our digital strategy and roadmap will help us to make sure we focus on the right opportunities, working hand-in-hand with business colleagues. Digital solutions will play an important role in Serica’s future, enabling the company to safely reduce operating costs and generate value by innovating, and we are looking forward to helping to deliver those solutions.
To learn more about the original asset transition project, read the full Serica case study, and for more on the IT and IM, Operations, Finance, and Board considerations of asset transition, please follow the links.
Sword are delighted to announce the extension of our Services Agreement at bp and excited to introduce Douglas Frisby as our Business Unit Director for US operations, based in Houston, Texas. The agreement extends Sword Venture’s long-standing partnership with bp and will continue to ensure the provision of value-driven data and information management services to all bp locations.
We look forward to this next phase of our work together, in particular the data challenge that supporting net carbon zero presents, and are thrilled that Douglas will join our Leadership Team to drive our simplified, standardised and insightful managed service approach in Houston. Starting his career as a Petroleum Engineer, Douglas has gained extensive energy industry experience, including leading bp subsurface technical teams across four continents, collaborating with the wells organization as the leader of bp’s Global New Well Delivery community and building capability as bp’s Director of Technical Development, Upstream Talent and Learning. Our Houston office will complement existing service provision delivered from our London, Aberdeen, Rijswijk, and Perth hubs.
Sword’s services deliver daily operational support, digital project expertise, and data-centric solutions to bp across the exploration and appraisal, development, production and engineering, abandonment, energy transition and corporate domains.
Phil Brading, Sword’s Business Unit Director, says “We are tremendously proud to both welcome Douglas and renew our agreement with bp at this pivotal moment in our industry. The combined influence of digital and the energy transition is having a profound impact on the way our customers do business and we are here to support that change.”
Dave Bruce, CEO, Sword UK, says “On behalf of everyone in the Sword Group, I would like to thank our Venture team for this significant achievement. The extension of our Services agreement with bp is fantastic news and has only been possible as a result of the hard work and dedication shown by our delivery teams over many years. The renewal of this agreement and the opening of our office in Houston is an exciting time for us and demonstrates our continued commitment to delivering quality global services to the Energy sector.”
We talked to Neale Stidolph, Head of Information Management and Energy Transition at Sword, about the considerations for IT and information management (IM) when kicking off an oil and gas asset transition project.
What are the key IT and IM considerations you would advise a potential buyer looking to acquire an asset on the UK Continental Shelf (UKCS)?
“People often say, ‘we’ll take all the information and set up the systems to use it; that should be quick.’ In practice, information is often buried in many systems, so the challenge is in getting clear definitions and being able to separate and extract that information. Being able to show you have all the right critical information is a key part of making the safety case and getting regulatory approval to operate the assets.”
How complex is it to extract the data and information on a particular asset out of the owner organisations’ clutches?
“Many applications are old and highly customised, or bespoke. Setting up new systems can be quite rapid but getting the data in the right form and making it useful takes time. There can be permission issues, with multiple data handovers and that all needs to be properly controlled between seller/buyer and any intermediaries – which is where we specialise.”
How long would you anticipate needing to transition the IT and IM on an asset from one owner to another?
“The nature of the assets is key. Small unmanned platforms could be done in a few months, but large onshore facilities or multiple major offshore assets could take a year. Some transition activities can continue post-transfer of operatorship, and many can run in parallel. We always work out the critical path, define long-lead items or systems and work around the buyer/seller timetable.”
What tasks tend to surprise new owners in terms of timescales when project planning asset transitions?
“It can be the non-technical challenges, such as contracts & procurement, with novations, bids and license transfers all extending the timeline. Asset transfers will usually involve dozens of systems, about 50-80 for the last two projects. That requires lots of vendor co-ordination, which we can smooth out.”
“Approvals to act from buyer/seller management are crucial, and there will be many decisions to be taken during the transfer process. The nature of the information can be very broad, from emails to databases, archive boxes, core samples, old magnetic tapes, etc.”
“You may start by working the way the previous owner did, so that is a common initial goal. However, as a buyer you are likely to want to change over time to work in your own way and do things differently.”
Do Sword ever act on behalf of the buyer and seller of an asset, and does that lead to any conflict of interest?
“It helps if we handle transfer activities for both buyer and seller as it makes the communication and data handover issues more efficient. This worked very well on the most recent transfer of GP3 and associated assets & licenses from Total E&P UK to Neo Energy.”
“The conflict of interest aspect tends not to be a problem, since we are achieving the same objective for all parties and have clear and documented processes for managing the transfer, with an audit trail of activities and approvals.”
What opportunities for change do you see new asset owners embracing?
“Having overcome the first hurdle of acquiring the assets and putting the data and systems in place, the real work begins! New owners will initially focus on running things safely, maintaining production, and checking on things like maintenance backlogs.”
“The future strategy will depend on the projected life of the asset or field. Ways to extend that life and increase recoverable hydrocarbons will be very important, so the challenge is in two parts: running production as smartly as you can, and looking at the subsurface, possibly reinterpreting old data, acquiring new seismic, and drilling.”
“Where we can help most is in making the right information available more easily, so that it provides solid decision support in those areas. We can also help reduce cost through fewer people offshore, simpler and slicker work processes, and easier engineering management of change – which may include energy transition work such as offshore wind and platform electrification, or the potential for CCUS and hydrogen production.”
Can you give us any examples of cost savings that you have seen new owners of assets achieve by changing the structure of the IT and/or IM approach?
“Yes, such as the trials of robotics for inspections and integrating drones into the client’s data systems. We have also seen improvements through increased safety, by making sure all asset information is ‘as-built’ and ‘as-operated’.”
“The back office processes have also been improved so that business administration is less burdensome. There is usually potential for more consolidation of systems, so that you need fewer applications and their associated support and license costs.”
In a nutshell, the notion that it’s possible to ‘lift and shift’ the IT and IM from one asset to another is sadly only a myth. The reality is that it’s worth investing in a realistic, integrated project plan to make the most of the change in structure. Changing an asset’s ownership gives opportunities to migrate only the relevant information into a new home and to streamline technology requirements, setting up the asset for new ownership and growth well into the future.
“A successful asset transfer needs to consider not just the information that you think you need now, but the information which may be of high value later. This includes design information that could make decommissioning easier and old well data such as cement bonding logs – as they may be needed for CCUS. Make sure you don’t lose things of potential future value. Also know that the seller may not give you everything, as it is a complex task – we can help you identify the typical blind spots from many previous projects of this type.”
For more information, please read more on our business technology solutions and stay tuned for a new case study on our recent work with Neo Energy when purchasing an asset package from Total earlier this year.
Despite all the hard work of the last 25+ years, we still face enormous challenges in Oil & Gas data management. Recently, the focus has shifted from managing structured data repositories to the enormous volumes of ‘dark data’ that most of our clients still have in their digital vaults.
Gareth Smith, Head of Consulting at Sword Venture, explains that this shift is driven by the need to feed large volumes of high-quality data into analytics and data science-driven processes; growing regulatory pressure to report data; and the need to ensure safe, efficient operations (especially for the many assets that have changed hands). However, most of this data is locked away in collections of documents and legacy proprietary formats, is nearly always poorly indexed, and is often not machine readable. We have seen a surge in requests to help our clients do something about this.
Most of these Oil & Gas data management projects break down into three core challenges:
Crucially, this has to be done without years of manual effort. It needs a different approach, using highly automated data science and analytics solutions to tackle these core challenges at scale and pace.
The first step is to build a pipeline that can ingest, store, process and extract/index data at scale. Unless our clients have the required compute resources to hand, we make use of the major public cloud platforms, e.g. AWS (Amazon Web Services) and Azure (by Microsoft), to provide the tools and processing power to tackle this challenge. A re-usable, cost-optimised and efficient data processing architecture based on cloud infrastructure reduces the costs and overhead to the client and allows us to move quickly through this stage of the process.
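As an illustration of this staged pattern (not a description of any production architecture), a pipeline like this can be sketched as a chain of independent stages, each of which could later be swapped for a cloud-native service; the file names, payloads, and stage functions below are all invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """A unit of work flowing through the pipeline."""
    name: str
    raw: bytes
    text: str = ""
    tags: dict = field(default_factory=dict)

def ingest(sources):
    """Ingest stage: wrap each raw payload as a Document."""
    for name, payload in sources:
        yield Document(name=name, raw=payload)

def process(docs):
    """Process stage: decode bytes to text (a stand-in for OCR/parsing)."""
    for doc in docs:
        doc.text = doc.raw.decode("utf-8", errors="replace")
        yield doc

def index(docs):
    """Extract/index stage: build a simple inverted index of name -> word set."""
    catalogue = {}
    for doc in docs:
        catalogue[doc.name] = set(doc.text.lower().split())
    return catalogue

# Hypothetical inputs standing in for files pulled from cloud object storage.
sources = [
    ("well_report_01.txt", b"Deviation survey for well A1"),
    ("seismic_notes.txt", b"Seismic reprocessing notes"),
]
catalogue = index(process(ingest(sources)))
print(sorted(catalogue))  # ['seismic_notes.txt', 'well_report_01.txt']
```

Because each stage only consumes and yields documents, the decode step can be replaced by a cloud OCR call and the index step by a managed search service without changing the other stages.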
The next step is to clean up the data, understand what we have, and identify the data that has value to us. This is where we apply data science and analytics to automate and massively speed up what was once a slow, labour-intensive manual process. For example, we have developed a model using a machine learning algorithm (a neural network) to predict the classification of a given document based on its text content, contained images, and structure. Accuracy levels are often in excess of 95%. We combine this with analytics-driven clean-up and cloud-based optical character recognition (OCR) to create a repository of well-structured, machine-readable content ready for further analysis and processing.
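To show the shape of the classification step, the sketch below implements a toy naive Bayes text classifier on a handful of invented training snippets. This is a minimal stand-in only: the approach described above uses a neural network trained on far larger labelled corpora, with image and layout features as well as text.

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

class NaiveBayesClassifier:
    """Multinomial naive Bayes with add-one smoothing."""

    def fit(self, samples):
        self.word_counts = defaultdict(Counter)
        self.doc_counts = Counter()
        self.vocab = set()
        for text, label in samples:
            tokens = tokenize(text)
            self.word_counts[label].update(tokens)
            self.doc_counts[label] += 1
            self.vocab.update(tokens)
        self.total_docs = sum(self.doc_counts.values())

    def predict(self, text):
        scores = {}
        for label in self.doc_counts:
            # Log prior plus smoothed log likelihood of each token.
            score = math.log(self.doc_counts[label] / self.total_docs)
            total = sum(self.word_counts[label].values()) + len(self.vocab)
            for token in tokenize(text):
                score += math.log((self.word_counts[label][token] + 1) / total)
            scores[label] = score
        return max(scores, key=scores.get)

# Invented training snippets standing in for labelled example documents.
training = [
    ("end of well report summary of drilling operations", "well_report"),
    ("deviation survey measured depth inclination azimuth", "well_report"),
    ("seismic acquisition parameters and processing sequence", "seismic"),
    ("stacking velocities and migration of seismic volumes", "seismic"),
]
clf = NaiveBayesClassifier()
clf.fit(training)
print(clf.predict("final well report with drilling summary"))  # well_report
```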
The ‘Find’ and ‘Sort’ stages are just a means to an end; the goal is to extract value from data by putting it to work. We use a combination of machine and deep learning to identify and classify specific data within a document or text and extract that data in a machine-readable format. For example, identifying deviation survey data (essential to all well interpretation) within documents and scanned images, extracting it, and making it available to engineers.
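A very simple version of the extraction idea can be sketched with pattern matching over OCR output. The column order, units, and sample text below are assumptions made for the illustration; real survey tables vary widely in layout and need per-format handling (and, at scale, learned models rather than a single pattern):

```python
import re

# Pattern for one survey row: measured depth, inclination, azimuth (floats).
# This column order is an assumption for the example only.
ROW = re.compile(
    r"^\s*(?P<md>\d+(?:\.\d+)?)\s+"
    r"(?P<inc>\d+(?:\.\d+)?)\s+"
    r"(?P<azi>\d+(?:\.\d+)?)\s*$"
)

def extract_survey(text):
    """Pull (md, inc, azi) tuples out of free text, skipping prose lines."""
    rows = []
    for line in text.splitlines():
        match = ROW.match(line)
        if match:
            rows.append((float(match["md"]),
                         float(match["inc"]),
                         float(match["azi"])))
    return rows

# Invented OCR output mixing prose, a header line, and tabular survey data.
page = """Deviation survey, well A1 (example)
MD      INC     AZI
0.0     0.00    0.0
500.0   1.25    143.6
1000.0  3.40    141.2
"""
print(extract_survey(page))
```

Once rows are captured as numbers rather than pixels or text, they can be loaded into whichever interpretation system the engineers already use.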
We are working on the use of natural language processing (NLP) and machine learning techniques to draw meaning out of documents in order to generate greater insight. For example, to automatically recognise the findings and outcomes of final reports for thousands of wells, reducing the requirement for manual analysis.
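The flavour of that task can be shown with a deliberately crude sketch: flagging the sentences of a report that mention outcome-related cue words. The cue list and sample text are invented, and a real solution would rely on trained NLP models rather than keyword matching, but the input/output shape is the same.

```python
import re

# Cue words that often flag findings/outcomes in final well reports.
# This list is illustrative only, not an exhaustive vocabulary.
CUES = {"plugged", "abandoned", "encountered", "tested", "suspended", "dry"}

def find_outcome_sentences(report):
    """Return the sentences that contain at least one outcome cue word."""
    sentences = re.split(r"(?<=[.!?])\s+", report.strip())
    return [s for s in sentences
            if CUES & set(re.findall(r"[a-z]+", s.lower()))]

# Invented report text standing in for a scanned end-of-well report.
report = (
    "The well was drilled to a total depth of 3,200 m. "
    "Hydrocarbons were encountered in the target sand. "
    "Operations concluded on schedule. "
    "The well was suspended pending further evaluation."
)
for sentence in find_outcome_sentences(report):
    print(sentence)
```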
Our goal going forward is to develop and deploy techniques such as deep learning and data engineering to synthesise large amounts of information and provide recommendations to assist human decision makers. For example, assisting with decisions on where to target exploration investment or how best to configure engineering parameters to reduce failure rates and maximise uptime. We recognise the value of combining data and knowledge to create unbiased predictive reasoning tools to support complex decision scenarios.
The combination of on-demand cloud computing and advanced data science and analytics techniques is revolutionising the way we manage data and extract value. We can tackle data at scale and pace, reduce manual effort and automate processes in a way that just wasn’t possible even a few years ago.
To find out more about how we can help you with your data management challenges, read on about our data and information management solutions here.