I work/consult for a seismic processing company. 130TB is a drop in the proverbial bucket. Our small shop has multiple petabytes of data in surveys/well bore logs.

If you were looking to modernize an industry, O&G is ripe for disruption. There are really only two major players, and they have awful legacy software. We spent $300k last year to acquire a single seat on a piece of software. We spent another $200k on two seats for a year of another piece of software.

These applications are total garbage too.



> If you were looking to modernize an industry, O&G is ripe for disruption.

The only way to "modernize" Oil and Gas that is compatible with a future for the planet is to shut it down.


Perhaps for the purposes of energy. There are myriad petrochem products that have no viable alternative.


No commercially viable alternative when the externalities aren't priced correctly. But hydrocarbons are already made from coal:

https://en.wikipedia.org/wiki/Fischer%E2%80%93Tropsch_process

The same process can be used with biochar, or potentially carbon from some future carbon capture method.


This is true and exactly why we shouldn't burn these valuable resources.


Along with all agriculture, medicine, transport, etc.

"Shut it down!" doesn't apply so well to the food supply right?


What's your point? If you cannot fix everything, fix nothing instead?


I think their point was shutting down oil and gas is de facto shutting down all of those other fields.


Almost everything you buy has an O&G component to it. With economic development around the world, at this point it is either O&G or worse alternatives (costlier and sometimes dirtier).

How do you think goods are transported? How do you think the material is processed?

I agree that humanity needs to find better alternatives, but we all have to be realistic.


> I work/consult for a seismic processing company. 130TB is a drop in the proverbial bucket. Our small shop has multiple petabytes of data in surveys/well bore logs.

The quality of the data is important. Potentially there is a lot of value here, especially for academics who might not have access to this kind of data. You may have multiple petabytes, but how much of that are you giving away?

I would also argue that 130TB is on the edge of what you can feasibly transfer and store without some kind of complex setup. When you get into petabytes you really have to design a unique system just to store and access the data.
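
For a rough sense of scale, here's the back-of-envelope transfer math (my own illustrative numbers, nothing from the dataset docs):

    # How long does 130 TB take to move at a sustained line rate?
    # All figures below are illustrative assumptions, not measurements.
    bits = 130e12 * 8  # 130 TB in bits

    for name, gbps in [("100 Mbps line", 0.1),
                       ("1 Gbps link", 1.0),
                       ("10 Gbps link", 10.0)]:
        days = bits / (gbps * 1e9) / 86400
        print(f"{name}: {days:.1f} days")

    # ~12 days of sustained 1 Gbps for 130 TB; the same math at
    # petabyte scale is months, which is when you start designing
    # around shipped drives and object storage instead of downloads.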


When I worked at CGG long ago, we used a lot of the proprietary E&P analysis software - it was primarily old UNIX but surprisingly capable and scalable.

I think there are a lot of unknowns in terms of capabilities and algorithms to go after in that market.

I’d thought of going back into it and developing some front-end visualization software - but the amount of secrecy and magic sauce put me off.


One contractor I worked with took their original software designed to run on IBM TSO/JES2-type mainframes and added a rough "GUI" to it. The parameterization of the modules was identical, but instead of entering everything into columns and rows like a set of digital punchcards, the user could simply fill out a field and the new software would insert that information into the correct row-column. Then they rewrote all of it from the top with a genuine GUI and sweet graphics, and we all laughed when the only thing that worked at their initial demo to the processing groups was a band-pass filter. Minor problem for them but a real hoot for us.
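
For the youngsters, a toy sketch of what that shim was doing - the field names and column positions here are invented, since the real module formats were proprietary:

    # Render GUI form fields into one fixed-column "card" image,
    # the way the shim described above filled in row-columns.
    def to_card(fields, layout, width=80):
        card = [" "] * width
        for name, (start, length) in layout.items():
            value = str(fields.get(name, ""))[:length].rjust(length)
            card[start:start + length] = value
        return "".join(card)

    # Hypothetical band-pass filter module: cols 0-7 module name,
    # 8-15 low-cut Hz, 16-23 high-cut Hz.
    layout = {"module": (0, 8), "locut": (8, 8), "hicut": (16, 8)}
    print(to_card({"module": "BPFILT", "locut": 8, "hicut": 60}, layout))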


What are the two largest applications? What do they provide? Are they doing analysis or just providing a database interface to all of the survey data?


Compared to Volve this is still far better. I'm just worried it'll be all seismics, RMS projects, and .segy files. We're working on a solution for ingestion of well reports/logs with the Volve reservoir but have precious few examples from the Volve field itself. Here's to hoping this dataset is better!
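
If it does turn out to be mostly SEG-Y, at least that part is easy to poke at with the open-source segyio library; a minimal sketch (the filename is a placeholder, not something from the dataset):

    # pip install segyio
    import segyio

    with segyio.open("survey.sgy", ignore_geometry=True) as f:
        print("traces:", f.tracecount)
        print("samples per trace:", len(f.samples))
        print("sample interval (us):", segyio.tools.dt(f))
        print("first trace min/max:", f.trace[0].min(), f.trace[0].max())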


> There are really only two major players, and they have awful legacy software. We spent $300k last year to acquire a single seat on a piece of software. We spent another $200k on two seats for a year of another piece of software.

What? There are many more than two major players, and every large contractor has written its own internal processing software. The best tools in their packages are reserved for internal use only, just as the majors did decades ago when they operated their own acquisition crews and had in-house processing staff.

Multi-nationals would do turkey-shoots to put new data in the hands of multiple contractors and let them have at it with their best processors and tools partly to see if processing shops had developed better tools but also to target talented processors for their own operations. It's pretty cut-throat out there.

If your company spent $300k on a single seat I would love to know what software they licensed. As an independent contractor for a couple of decades I have been able to license top packages from top-line processing software companies for under $100k per seat. After you buy that seat you are only paying maintenance in successive years, so your costs usually drop to around 20% of the cost of a new license; maintenance covers patches and support and entitles you to new versions as long as you stay current on your license. You can get a second-tier package for less than $60k + 20% annual maintenance. Some brand-new packages I evaluated in the last 5 years debuted below $40k, full-featured and ready to go from field tapes to final deliverables. Your people must not do any evaluation at all.
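
The cost-of-ownership math with those ballpark figures (illustrative only; actual quotes vary by vendor and negotiation):

    # Initial license plus ~20% annual maintenance in later years.
    def seat_cost(license_price, years, maintenance_rate=0.20):
        return license_price + license_price * maintenance_rate * (years - 1)

    for price in (100_000, 60_000, 40_000):
        print(f"${price:,} seat over 5 years: ${seat_cost(price, 5):,.0f}")

    # A $100k seat runs about $180k over five years - nowhere near
    # $300k per year for a single seat.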

The seismic software field is constantly changing.

> These applications are total garbage too.

Haha. I have seen a lot of this in my time in the industry. One thing that chaps most of our asses is that the larger software companies are tuned to the needs of those who hold the most licenses, so small shops are frequently ignored when they request new features, bug fixes, etc. in favor of the provider adding some new whiz-bang feature for a large license holder.

A lot of the software packages available share the same roots. Several packages that I have personally evaluated are derived from a single public code base, with the only real difference being their GUI. One may be more user-friendly; another sucks to deal with but has all the tools plus some custom gizmorithms; a third is almost a clone of the first but leaves out the modules most useful for land or marine data and doesn't allow VSP processing.

I know they are the same code base because I cornered the developers when I noticed errors common to all of them. I had documented a persistent bug (hey, now it's a feature!) in one provider's software while also evaluating other providers' software, and in the process found two other packages that would produce the exact same output every time given identical input, even though the output was clearly wrong. In one of the packages, even the parameterization screen dialogs were nearly identical. Pretty unimaginative GUI coding for some of this stuff, but it is likely because the guys slapping the interface on this kludge don't actually understand the objectives behind what we are doing, don't understand which bits of information make or break the imaging effort, etc., because they are coders and not geophysicists.
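
The check itself is trivial once you suspect it - run identical input through each package and compare the exports (filenames here are invented):

    # Hash each package's output; identical results from "independent"
    # vendors - including output you know is wrong - is a strong hint
    # of shared ancestry. (In practice you'd compare trace samples
    # rather than whole files, since text headers usually differ.)
    import hashlib

    def digest(path, chunk=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    for path in ["pkg_a_out.sgy", "pkg_b_out.sgy", "pkg_c_out.sgy"]:
        print(path, digest(path))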

But, overall, large contractors employ some of the brightest minds in geophysics, computer science, physics, mathematics, geology, etc. That is the main reason that we are able to squeeze old data to extract even more geology than we initially could when it was acquired. Algorithms have improved, hardware is up to the task of keeping everything straight as it gets hammered through the flows.


For seismic work, as far as I can tell, almost everyone is using just Petrel and Paradigm as their core day-to-day tools. H&R has some stuff, but really, if you look at people working in the field you'll see Petrel experience is the most common trait.

Yes, every shop has internal tools, but again, much of this is stuff cobbled together by non-software developers. I remember at one point a major tool at one shop stopped working. It turned out they had hard-coded a network share as a temp folder and the folder got removed.
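
That failure mode is depressingly easy to write - the share path below is made up, but the pattern will be familiar:

    import os
    import tempfile

    SCRATCH = r"\\seis-nas01\shared\tmp"  # hard-coded share: a landmine

    def fragile_scratch_file(name):
        return os.path.join(SCRATCH, name)  # breaks the day the share moves

    def robust_scratch_file(suffix=".tmp"):
        # Let the OS pick a temp dir that is guaranteed to exist;
        # it can still be redirected via TMPDIR/TEMP if needed.
        fd, path = tempfile.mkstemp(suffix=suffix)
        os.close(fd)
        return path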

Our $300k spend was on a fully loaded license from a company that rhymes with lumberge. I don't want to name them since there was a pretty strict NDA involved.


Thanks for the additional information. I believe you were probably listing prices for interpretation software packages like Petrel. Processing software is an entirely different animal and that is what I was talking about in my reply. You have to process the data so that it can be interpreted.

Your big packages for interpretation are Petrel, Paradigm's Epos systems, and IHS Kingdom-SMT. They are all great packages, but as you noted there are things in each one that the user will stumble into which make no sense and likely resulted from poorly coded features that should have been upgrades but ended up as kludges.

> rhymes with lumberge

Scumbagger? As a former employee I can tell you that the best day of my life, outside of my marriage and the births of my kids, was the day I opened my mail to find an offer letter from another company - which I promptly accepted. Scumbagger bought a great processing software provider a few years ago. Their software was full-featured and very user-friendly. It had some quirks and a lot of kludges but their software support was top-notch, the best I have seen in the industry. After the buyout, the older support hands were laid off and the support was bureaucratized to the point where it was no longer worth it to report a bug or request support. A true shame. Once a company gets that large they become like that old saying about juggies on the field crews - if they can't fuck it up they shit on it.


Seconding the 'bright minds' point about contractors... I was a physics-trained coder at a top-5 US university, contracted to do 3D modelling of sound-based exploratory data, with accurately modeled earth materials, in the software. So the strata of the earth materials for a real place on the one side, and the behavior of the sensors on the other. This was in the 2007 era.


I hope this position was a great stepping stone in your career and helped you reach your goals. Complex imaging problems require the skills of so many people trained in different disciplines.


It would take a ton of cash to disrupt O&G. Hundreds of millions of dollars. In a notoriously volatile market that may or may not be dying/declining.


You're not completely wrong, but O&G doesn't really hire all-star software developers. Most of the software is complete crudware with ASP/dated MSSQL back ends and just buckets of bad, legacy code.

I've also seen a lot of geo/petro physicists/processors intentionally not wanting to help because they know heavy automation endangers their jobs.


Is this really the case? Can you give us some use cases where software could increase productivity in O&G that aren't already being addressed?

One could argue that existing software, even if legacy, works and gets the job done, so what is there to (significantly) gain from writing new (and improved) software?


I think the biggest opportunity in this industry is automation of the survey process as a whole. It currently requires big expensive ships with big expensive sensors and lots of manual data processing from people earning reasonable wages because they have to stare at a computer screen in some fairly nauseating conditions. The big ships tend to send companies broke fairly quickly when there is a downturn in the market (i.e. no exploration). An autonomous survey vehicle has the potential to massively reduce survey costs (as long as it is reliable).


> lots of manual data processing from people getting reasonable wages because they have to stare at a computer screen in some fairly nauseating conditions.

Most positions I've seen do not look like they pay that well. You can make more as a processor in a shop on land and work a regular 8-5 job. On the boats you get a 12-hour tour for the duration and the only real perk that might make it worthwhile is the opportunity to visit foreign ports and dawdle during breaks.

Survey automation is a complex task because the constraints depend on whether you are a marine crew or a land crew. Some things are easier in marine work due to fewer cultural constraints (buildings, highways, pipelines, etc.). On the other hand, it is easier on land to locate and replace any sensor that fails without losing much data from that receiver location. For the best imaging you need to avoid introducing holes in your data coverage and correct anything that causes a data loss. Redundancy is a real thing out there.

The industry has morphed into one where many larger acquisition contractors have divested themselves of the ships needed to acquire the surveys and now contract that out to custom acquisition crews. Everything went bare-bones and rawhides in the last downturn and, as we know, seismic exploration is one of the last things to recover after a bust.


From what I've seen, all the survey shops are just bleeding money because A. survey equipment has a huge and expensive monthly cost, and B. most supermajors aren't ordering new surveys like they used to; instead they're choosing to have old data reprocessed.


Acquisition is always the last thing to recover after a bust. There is so much legacy data around for reprocessing that all they need do is find someone with data in their prospective area and have it reprocessed using the latest imaging tools. A lot cheaper than acquiring new data.

Also, there is a shift in the industry from ownership of the survey equipment (sensors, recording systems, etc.) to rental of everything. Manufacturers build it all, rent it out for custom surveys, maintain it and service it all, train the equipment operators, etc. That cuts costs and makes acquisition a matter of retaining trained personnel for key positions and recruiting trainable people for the rest.


Most of the advancement that leads to actual profit increase is in analytics right now. Getting more out of data before investing heavily.


Labor is one thing but I'm really talking about how expensive it would be to get access to dozens of live rigs and tons of data.


At those prices, how big is that total market? Tens of millions?


A number I saw a few years ago is that just the plugin market for Petrel is estimated at 4.5 billion USD annually.

Unfortunately I have no way to verify the source.



