Hacker News

It is stereotypically Amazon to spend tens- or hundreds-of-millions to develop a fully-integrated next generation truck and then try to save ten bucks by putting the slowest possible CPU behind the head unit / infotainment system.


If working in the modern software industry has taught me anything, it's that developers can figure out how to write slow janky UIs no matter how powerful the processor.

I'd go so far as to say there's an inverse correlation here. My 100MHz desktop had a more responsive UI than any computer I've used in the last 5 years.

Text editing on that 100MHz computer felt responsive: typing characters had them appear on the screen instantly, and I could copy+paste 100s of lines of text without a problem. Now, I load notion.so on a 4GHz CPU, and typing is sluggish, scrolling has severe lag, and pasting 100 lines of text sometimes takes several seconds.

All this is me saying that I'm more inclined to blame software than to blame the hardware for UI lag.


Text editing on that 100MHz computer felt responsive...

Of course it did, because you were only editing the text. If you switch off syntax highlighting, linting, autocomplete, type checking, git integration, spell checking, grammar checking, and everything else your editor is doing then you can experience the joy of fast typing again.

A 4GHz CPU is only 40 times faster than a 100MHz CPU (not that clock speed is the important bit here, but whatever), but you're asking it to do probably about 20,000 times more computation every time you press a key. And then you complain that it's too slow!


Emacs on my first computer, an i486 with a whopping 33 MHz of compute power, had no trouble with syntax highlighting, autocomplete and spell checking.

The problem with slowness is in the design, not in improved functionality. The root cause is that modern software runs a ton of checks for every keystroke, tries to talk to the mothership every second, and is designed to advertise and upsell instead of solving actual user problems.


TBF, Emacs/vim today with pretty much every feature turned on also don't feel sluggish.

I think the two issues hitting modern text editors are that they are FAR too synchronous (they do the syntax highlighting, fonts, etc., on every keystroke rather than in the background).

And rendering has gotten WAY too nuts. Seems like all modern text editors are full blown web browsers, usually so they can be easily cross platform.
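For what it's worth, the "do it in the background" idea is easy to sketch: debounce the expensive pass so a keystroke only resets a timer, and the real work runs once typing pauses. A toy Python sketch (`DebouncedHighlighter` and the `highlight_fn` callback are made up for illustration, not taken from any real editor):

```python
import threading
import time

class DebouncedHighlighter:
    """Keep the keystroke path cheap: each keystroke only resets a timer,
    and the expensive highlighting pass runs after the user pauses."""

    def __init__(self, highlight_fn, delay=0.05):
        self.highlight_fn = highlight_fn  # the expensive pass (hypothetical)
        self.delay = delay                # quiet period before running it
        self._timer = None
        self._lock = threading.Lock()

    def on_keystroke(self, buffer_text):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # still typing: postpone the work
            self._timer = threading.Timer(
                self.delay, self.highlight_fn, args=(buffer_text,))
            self._timer.daemon = True
            self._timer.start()

results = []
hl = DebouncedHighlighter(lambda text: results.append(text), delay=0.05)
for text in ("h", "he", "hel", "hell", "hello"):
    hl.on_keystroke(text)  # five rapid keystrokes
time.sleep(0.3)
print(results)  # only the final buffer state gets highlighted: ['hello']
```

Real editors do this with incremental reparsing rather than a flat timer, but the latency win comes from the same place: the per-keystroke path does almost nothing.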


There's not an option to switch various features off in many contexts.

It is a requirement imposed upon me that I write documents in notion, and that text editor doesn't let me import locally written files since it doesn't have any lossless textual representation. I can't avoid the laggy text editor. It's a requirement that I use slack, and no combination of settings for the browser or desktop client seems to stop massive input lag.

Typing into iOS or android's keyboard, in any text box, is an experience with very noticeable lag.

I know that the computer is doing thousands of times more things than my older computer did, but I don't want it to do that stuff and there's no way to turn it off while still participating in modern society (i.e. using a cell phone and working at a company).


I don't know if you've tried this, and it might very well not be easier per se, but you can separate writing your text from the "typesetting" by composing in a barebones editor and then pasting that text into wherever you need to display the text. I don't know if something as barebones as notepad exists for phones, but the method I've described might make your experience better on a desktop.


A modern 4GHz CPU is not only 40 times faster. It is a few thousand times faster than a 100MHz CPU back from the days. Probably not 20,000, but at least 2,000 times faster seems reasonable.

And responsiveness back then was so good, because your program was very close to hardware with very little in between if not running completely free from OS abstractions.


Can you show your working on this? Because a 100MHz CPU can do 100,000,000 things a second, and a 4GHz CPU can do 4,000,000,000 things a second, and if my math's right, that means the 4GHz CPU can do 40 times as many things a second as the 100MHz CPU.

Now, you might argue 'the 4GHz CPU is multicore!', and so sure, maybe we're up to 8 times 40, which is, I'm pretty sure, 320. And maybe you'll say that the cache is bigger, so you'll be able to keep the data pipelines full and get more done on the faster CPU. But how are you getting to 'at least 2,000'?


Sure. I'll oversimplify a lot, but the overall picture should be correct.

The clock frequency is not a good way of measuring performance. It never was. Even earlier designs like the 8086 did not do one thing (instruction) every cycle. They did far less.

Modern CPUs are extremely complex beasts that can take in a lot of instructions. They take a good look at those instructions, change them in a way that does not alter the result but makes some optimizations possible, and then distribute those instructions to a bunch of internal workers that can work on them at the same time. More on this can be found in the Wikipedia rabbit hole starting with instruction-level parallelism.

One way to measure this is to look at how many instructions per cycle a CPU can retire on a selected workload. An 8086 could do 0.066. A 386DX did 0.134, a 486 could do 0.7. A Pentium 100 already could do 1.88, and so on. Modern CPUs get to 10, per core.
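To put rough numbers on that (a napkin estimate using the IPC figures above, nothing like a real benchmark):

```python
# Napkin math: effective throughput ~= clock frequency * instructions/cycle.
pentium_100 = 100e6 * 1.88  # Pentium 100: ~1.88 IPC at 100 MHz
modern_core = 4e9 * 10      # modern core: ~10 IPC at 4 GHz

per_core = modern_core / pentium_100
print(round(per_core))      # ~213x per core
print(round(per_core * 8))  # ~1702x across 8 cores
```

So even before counting wider vector units or bigger caches, "a few thousand times" is in the right ballpark.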

But wait, there's more. This comparison gives only a very rough idea of a CPU's capabilities since it focuses on a very specific thing that might have little to do with actual observed performance. Especially since modern CPUs have extremely specialized instructions that can do enormous amounts of computation on enormous amounts of data in little time. And there we are in the wonderful world of benchmarks that may or may not reflect reality by measuring execution time of a defined workload.

Passmark does CPU benchmarks, and their weakest CPU in the database seems to be a Pentium 4 @ 1.3GHz. Single core, single thread. It comes in at 77 (passmarks?). An i7-13700 is rated at 34,431. Does that make it 500 times faster than the 1.3GHz P4? Hard to tell, but it's a hell of a difference. And from the P4 to a Pentium or even a 486 running at 100MHz ... at least another hell of a difference.

We can also try Dhrystone MIPS, another benchmark. Wikipedia has - strangely enough - numbers for the Pentium and the 486 at 100MHz: 188 MIPS for the Pentium, 70 MIPS for the 486. The most modern (2019!) desktop CPU entry comes in around 750,000 MIPS. A Threadripper from 2020, over 2,300,000 MIPS.
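Dividing out the quoted figures (same caveat as any synthetic benchmark - treat these as order-of-magnitude numbers):

```python
# Ratios from the Dhrystone MIPS figures quoted above.
pentium_100_mips = 188
i486_100_mips = 70
desktop_2019_mips = 750_000
threadripper_2020_mips = 2_300_000

print(round(desktop_2019_mips / pentium_100_mips))    # ~3989x vs Pentium 100
print(round(threadripper_2020_mips / i486_100_mips))  # ~32857x vs 486 at 100MHz
```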

So, how much more can a modern CPU do than an ancient one? A lot. And especially a lot more than you would expect from the faster frequency alone. Even with only one core, it can do several hundred times the workload. And we got a lot of cores.


While it's harder to calculate, that 4GHz CPU comes with vastly faster RAM, buses, and disk. Not many 100 MHz systems around with NVMe or even SATA...


>your program was very close to hardware with very little in between if not running completely free from OS abstractions

This! It also meant that it was very very easy for any program or misbehaving driver to completely crash your system. Not to mention all the security implications of every app having direct hardware access.


But when I go look at my text editor being slow, I can see that the amount of CPU time spent dealing with the kernel is less than a tenth of it. So that's not the reason.


That's not how latency or responsiveness works.

Dan Luu did cool experiments with input lag (https://danluu.com/input-lag/).


It's a much better estimate than hand waving about memory isolation.

If we want to talk about how things work directly, my program can get things to the GPU in far less than a millisecond. The safety layers are not the problem.


No excuse for any of those things to slow down the actual typing.

And a lot of those computers did do fancy checking, and a modern CPU can do ten times the instructions per clock on top of having 6-8 cores.


40 times faster seems fast on paper, but honestly I would love a CPU 4 million times faster.


> you're asking it to do probably about 20,000 times more computation every time you press a key

If hinting, spellchecking and autocompletion take so much computation, you have terrible tooling. These kinds of functions existed 25 years ago, and they were real-time.


> but you're asking it to do probably about 20,000 times more computation every time you press a key.

[citation needed]


Here's a table with the input lag and release date of various computers: https://danluu.com/input-lag/

It seems like there might be a slight negative correlation (newer is worse). The data is fairly all over the place though.


I seem to remember that a good pencil has lag on the order of 1ms (how long it takes the tip to settle out after high speed motion), but I don't remember where this number came from, so it's untrustworthy.

I always view the IBM selectric as "beyond the horizon" in terms of responsiveness: that's 25–30ms.


> I seem to remember that a good pencil has lag on the order of 1ms (how long it takes the tip to settle out after high speed motion), but I don't remember where this number came from, so it's untrustworthy.

Not sure what lag there is on actual pencil, but there's this somewhat famous video[0] from Microsoft Research, where they test various values of input lag on a touch screen, and demonstrate that even 10ms is noticeable - but 1ms is about enough for it to feel instant. With 1ms lag, drawing on a tablet seems to feel just like drawing on paper (I personally haven't had an opportunity to test a tablet with 1ms lag, so I can't tell).

--

[0] - https://www.youtube.com/watch?v=vOvQCPLkPt4


...until you start comparing lag and CPU performance. Then it becomes obvious what kind of bloated software the world is producing.


And then you notice that recently there was an article pointing out that "software is IO-bound anyway" is no longer true, and this claim - for the first time since I can remember - wasn't laughed out of the room. I await a more detailed study, but it seems quite likely the article was correct: IO is quite fast these days, and software is likely to be CPU-bound, particularly on parsing stuff as it's being read and operated on.


Wasn't the limiting factor RAM latency? Anyway, yes, computers parse way too much JSON, SQL, and whatever else. I've recently started working with gRPC, and boy is it nice. The current paradigm where every action requires a text blob to be written, compressed, then sent to a server that decompresses it, parses it, acts on it, then creates a new text blob that is compressed and sent out... It's a bit much. And that's before you add load balancers and microservices. gRPC cuts down on the parsing and compressing bits, but it's still quite wasteful.
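The per-request dance described above, in miniature (a toy Python illustration, with `zlib` standing in for whatever compression the transport actually uses, and a made-up request payload):

```python
import json
import zlib

# serialize -> compress -> (network) -> decompress -> parse
request = {"action": "get_user", "id": 42, "fields": ["name", "email"]}

wire = zlib.compress(json.dumps(request).encode("utf-8"))     # sender side
received = json.loads(zlib.decompress(wire).decode("utf-8"))  # receiver side

assert received == request  # lossless, but every hop pays CPU for all four steps
print(len(wire), "bytes on the wire")
```

Binary formats like gRPC's protobuf skip the text serialization and re-parsing steps, which is most of the saving.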


I'm going on record to suggest that people who have used computers from the 80's and 90's have rose-tinted glasses on when remembering their experiences. Nostalgia is a bitch and lies to us all the time. Just loading a fucking program from a spinning-disk hard drive added a significant amount of startup time to programs, which we've completely imagined away from the "golden age" of our computer usage. At this point I'm convinced it's just a natural extension of "they don't make it like they used to", which has plagued mankind for basically every generation ever and completely ignores survivorship bias. I distinctly remember experiencing *major* performance gains when moving to more memory than my system and apps needed to use, and the move to solid-state drives. None of the "programmers don't know how to code today" nonsense even comes close to eclipsing the performance gains from those two changes.


DOS programs loaded fast, since there was around 1MB of working memory. Granted, they did not do much, but they were responsive, except when they needed to do real compute; then you just waited forever.

Windows 3.x/9x with the first real multitasking and swap - things got quite sluggish. Early browsers were particularly eager to gulp RAM and bring everything to a halt.

Nowadays it feels somewhere in the middle: with copious amounts of RAM, stuff can be fast once up and running, but it seems every app either wants to load from the web at every step or tries to index your drive on every keypress.


Part of it is rose-tinted glasses, sure, but it's not entirely rose-tinted glasses.

The "Living Computer Museum" in Seattle was (up until covid closed it) an excellent way to experience the past.

I had this same insight and feeling then. The old machines, running their old operating system versions, felt responsive and crisp compared to performing similar tasks on modern machines with modern software equivalents.


Both can be true at the same time. You can have a very slow program startup time but once it is up, it's very responsive to inputs.


So it's funny you mention that. I own a couple of decidedly retro machines from the 90s, one runs DOS and the other runs Windows 98. These are used both for gaming and for productivity, both are equipped with solid-state drives as of late (I still have the original spinning rust.)

Both of these machines are many times more responsive than anything put out in the last decade despite being thousands of times less powerful. The applications they run are made to serve my needs as a user first. They get out of my way. The user interface is clearly designed for the mouse and keyboard I am obviously using. It is made to help me accomplish tasks more efficiently rather than stroking some designer's ego or chasing some fad. Most of the software I use was released "done" rather than released half-baked with the hope of future updates. I don't have to worry about having my privacy invaded.

It's not rose tinted glasses when I'm not wearing any and can look behind me and see the color. Modern mass-market technology is "worse is better" writ large.


Yes and no. I am now running the equivalent of computing power that was once reserved for some dedicated government agency in the early 90s[1]. I accept that some of that power is put to good use, but I do see wasted power in the OS[2] and even games[3] on a semi-regular basis.

For the record, "they don't make 'em like they used to" is an absolutely valid complaint. Note that now, even with SSDs, ridiculous memory and CPU, websites still manage to stutter (but I accept online is its own animal). Part of me wants to go over Windows releases as an example of resource use across generations (and how they leaped).

<<I distinctly remember experiencing major performance gains when moving to more memory than my system and apps needed to use and the move to solid state drives.

But do you still see it or is that performance assumed now ( making lazy design decisions easier )?

[1]https://www.reddit.com/r/Amd/comments/kdr3p6/how_would_amds_... [2]https://www.youtube.com/watch?v=hxM8QmyZXtg [3]https://arstechnica.com/gaming/2021/03/hacker-reduces-gta-on...


Some things were better, some things worse. For example, I hated the thousand pound CRT monitors and love my thin LED hi res.

I also hated floppies. Squeek, squeek, grind, grind.

The old hard drives were fine, though.


But you are forgetting that you used to be able to smell your computer.


It's just not monolithic. We've made serious improvements in a number of areas, but there are a few areas where we've clearly stepped backwards. Neither opinion really contradicts the other.


Try a text editor that isn’t built on top of Electron, perhaps.


That's kind of the point isn't it?

Personally, I would be ecstatic to ditch VSCode for some super fast native code thingy, but the truth is that it's often just a better experience. The ecosystem is big, and the performance is not all bad news (the ripgrep-based search is pretty damn impressive, and beyond that, it feels like VSCode has fewer performance cliffs in normal use than a lot of older editors I've encountered, including KDE's Kwrite and GNOME's Gedit, and even other Electron editors like Atom. It's not perfect, but there are times when I left Kate for VSCode because opening some file made Kate way too unresponsive.)


I just moved from VSCode to RubyMine for my Rails work because I was finding the Ruby plug-in ecosystem for VSCode to be too flaky. Performance in both has been fine for me, despite their being built in JS and Java.

All that said, I really do miss TextMate.


OK, I will give you one thing: I do love JetBrains IDEs. That said, I don't use them for everything. They use a lot of RAM for me when loading large projects, and they feel a bit awkward outside of "traditional" project structures that work well with VS Code.

I know JetBrains is making a text editor, too. It does look very nice, but I have yet to try it.

This all being said, JetBrains IDEs do still feel less responsive than more native stuff like Kwrite or Notepad++. It can be forgiven of course, but I think it's a valid point to note down, especially given how often people bring up that article comparing latency of desktop computers over time.


> All that said, I really do miss TextMate.

It's still there, no? https://macromates.com/


It is there, but seems to be essentially abandonware. The ‘mate’ command line tool to launch the app fails more than half the time on Apple Silicon, the plugins and bundles that exist are years out of date for Ruby and Rails, and I don’t expect it’ll ever get support for LSP.

So, sure, TextMate still exists. It’s still my default scratchpad for modifying text, and I even use it for some light scripting from time to time, but I don’t actually use it to build and maintain complex software any longer.


Looks like some people tried to port it to Tauri/Rust. I'm sold on VSC personally. Got my configs all setup. Runs on a phone (Mobian) ha.

Edit: I'm aware there's the browser one now built into GitHub.


100MHz, so to be clear, we're talking about like an early Pentium. 1995ish.

So that 100MHz computer had maybe 8 or 16MB of RAM - it was struggling to run anything other than the text editor you were using - no, you aren't running an MP3 player in the background, Winamp has not been released yet; no, you are not running your email client in the background, you're on a 28.8K dialup modem and you are not hanging on the line on the off chance that mail arrives - you are dialing up, waiting 30 seconds for the modem to handshake, then watching your emails trickle in at a rate of one every few seconds.

The machine took noticeable time to seek and load or save to the hard disk, so it made up for that by trying to keep everything in RAM all the time, with the downside that if it crashed (and it frequently crashed) you would lose all your work since you last hit save.

So yes, the cursor responded right away when you pressed a key. But other than that, golden days, those were not.


The microcomputers I used in the 1980s, running on ~1MHz CPUs, were more responsive than everything that came after.


The text editor on my DOS machine would load instantly. It was also only 56K in size.


It still would, if you were willing to use a text editor in DOS.


Actually, the same editor recompiled for modern machines also loads instantly!

https://github.com/DigitalMars/me


Edlin was 2K and for a year or two was my main editor.


I wasn't willing to go that far!


It's that whoever wins forces their policies on the other department. If the controls team buys the cheapest hardware, they force the dev team to be economically competent; otherwise the software team can hire juniors and get away with it.


A thousand times this. I die a thousand times a day waiting for software.


I've always used vim for text editing (and programming languages that do not use JVM), never had a problem with responsiveness regardless of computer spec.


Exactly. Hardware was fast enough in 1969 to put men on the moon. In 2022 we have laggy text editors. That's not a hardware problem.


But that 1969 hardware did that one exact thing and nothing but. No networking, no displays beyond some digits, etc. It was a calculator. No one's calculator is running that large text editor, 20 Chrome tabs, Teams/Slack, Spotify... Yeah, nothing should lag these days but it's not really a valid comparison between today and the Apollo computer.


It's pretty clear whoever designed this toy has never driven an actual cargo van or done any blue collar work in their life. That Tesla-style touchscreen for starters. I give it 3-6 weeks before it's shattered or fails. The look on the guy's face was priceless when he couldn't open the cargo door without walking over to the touchscreen and tapping an icon. As opposed to having a simple button on the door itself. Many late model cargo vans do share crappy infotainment systems with their passenger counterparts, but almost never use them for core or upfitter functions, just the radio.

Have you ever seen the inside of a Ford Transit van? They're remarkably pedestrian. Or a UPS truck, made from steel and brawn and not much else. These things need to be engineered to take a beating. Yet this has push-button start with a fob. (Keys are a much better choice for fleet vehicles, if not only for key management, they withstand abuse in the field much better than fobs, which are expensive and difficult to replace.)

What they should have installed instead is a cup holder for the sports drink urine bottle. Which you keep next to the cigarettes and burrito wrappers, to the left of the shitty basic FM radio with real knobs.

How are you supposed to use a touchscreen for basic functions whilst wearing gloves? This thing was designed by office-dwellers.

All they need is for Ford or Benz to turn their bare bones cargo vans electric and they'll be turning these Rivians into beer cans.


Whoever decided to introduce touchscreens inside cars and vehicles should be chastised. Touchscreens inside a vehicle are akin to texting while driving for me.


Yep. I've passed on this generation of fully electric vehicles because none of them offer physical buttons to operate the climate control.

Automatic climate controls don't cut it--change directions such that the sun starts beating down on me and I'm going to need to turn up the fans beyond what automated systems would choose. Test driving the Tesla Model S, on two separate occasions while attempting to set the climate controls I almost got into a wreck--the control is not only on the touchscreen, but also buried under something like 5 menus. WTF? The Volvo XC40 and Mustang also have no climate control buttons. In my area, all the other electric cars have a waiting list longer than a year. I want to buy electric, but safety is paramount--driving a car is far and away the most dangerous thing I do on a daily basis and I'd like it to be as safe as possible. I ended up with a Subaru this time.

Hopefully someone will make an electric car with physical buttons for the climate controls.


> electric car with physical buttons for the climate controls.

I drive a VW e-up! for the same reasons. It is a simple city car with analog controls, and it's perfect for city driving in Berlin.


F150 Lightning (base model) has non-touchscreen UI.


The new Lyric has very nice tactile controls.


> Touchscreens inside a vehicle are akin to texting while driving for me.

No, it is quantifiably worse than texting. In fact, it is worse than drinking and driving. [0]

[0] https://www.trl.co.uk/publications/interacting-with-android-...


> How are you supposed to use a touchscreen for basic functions whilst wearing gloves?

I don't know about this one, but some car touch screens can be operated with gloves just fine.

(They still suck in comparison to actual buttons and knobs, though)


All they need is for Ford or Benz to turn their bare bones cargo vans electric

You mean like this?

https://www.ford.com/commercial-trucks/e-transit/

https://www.mercedes-benz.co.uk/vans/en/electric-vans


Yes exactly like that.

You'll notice the Ford Transit EV still retains the traditional mechanical key, and also features a simple locking glove box that works off the same key.

But why do that when you could make the glove box electronically locking via bluetooth proximity over the infotainment.

People have deluded themselves into believing that EVs must be packed with superfluous electronics and other futuristic garbage, whilst it's just the drivetrain that's different.


These vans have awful range.

I have an 18 year old VW T5 Transporter with light camper fitout, which will do 900km (550 miles) on a tank.

The Ford linked above claims 200km (120 miles) between charges, the Merc 95 miles.

These vans won't replace my van for my use case - not yet, and not at that price.

They'd be reasonable for a delivery route, I suppose.


Yeah, they're not targeting your use case. The vast majority of these vans, I suspect, are used for urban delivery or trades roles, where 200km is fine. 900km range vans are perfectly possible, but will be very expensive and most van users won't need them.

EDIT: Also, in many countries you can drive a van with a normal driver's license provided it's under, usually, 3.5 tonnes. Once you go over that you need a special license. 900km's worth of battery might make it difficult to fit a reasonable payload under the legal limit.


If you can charge at home/base then, really, you only care about the range being sufficient for one day. I think 200km is enough for many, if not most, use cases.

Most of the Transits I see (in the UK) are not used to drive long distances in a day but to carry stuff around. Think electricians, plumbers, all those trades.

Even an Amazon delivery driver may not drive more than 200km a day. What they do is plenty of stops and I suspect it takes them hours to drive just 20km.


I think plugin hybrids are potentially great for the transition to full EV for people who need to drive long distances.

There are many people who never drive more than 100 miles a day. Why do they need a car with 200, 300 mile range? Not to mention we'll be investing in charging stations along the Interstates, so charge scarcity will be less of a problem.

Fleet vehicles are the perfect first adopters of EVs - they have been using alternative fuel sources for decades in the form of natural gas and LNG. They have short daily range, they go back to a central depot each shift.

It isn't awful range any more than a Mini has awful towing capacity for the person who never tows.


There's going to be a market for both.

The ones with less range can be smaller, lighter, and so less expensive. If all you need is 120 miles, there it is.

Someone will make one with a bigger battery.


> The Ford linked above claims 200km (120 miles) between charges, the Merc 95 miles.

200km is more than enough for a typical parcel delivery day.


Thankfully this article is about trucks :-p


I rented an electric VW Transporter recently (think it's the same van as the Mercedes Sprinter), and it really felt like the future. It was just so nice to operate. Smooth, quiet, tons of torque, and it will probably take a beating day after day with little to no servicing. Range wasn't massive, but for your typical daily urban delivery round that doesn't matter so much. It just made so much sense.


>How are you supposed to use a touchscreen for basic functions whilst wearing gloves?

Ah, there are touchscreen-friendly gloves now. Amazon has plenty in stock!

Delivery drivers already have to deal with that issue because everything is on smartphone/tablet.


Every set of touchscreen-friendly gloves I’ve tried stops working after a couple of goes in the washing machine.


There's a comment on the video from someone that has, though no mention of the touchscreen specifically.

"I used to deliver out here in the Bay Area. This Van has SO MANY improvements over the normal Transits and Fluid Vans. I absolutely LOVE the removal of the side door for the new passenger door space. It actually makes the most sense ever as someone who actually delivered. The spacing on the shelves is questionable to me however.... They are DEFINATELY less wide, the Transits could fit a tote on them with maybe a few inches of overhang."


^^ This 10000x.

Sure there are some nice comforts and features that will make life easier for the operator, but on the whole it's a massive step backwards in actual durability, utility and usability.


> All they need is for Ford or Benz to turn their bare bones cargo vans electric

These are already a thing, at least in Europe. Ford only released theirs this year and the Mercedes ones are next year, I think, but Peugeot and Renault have had various electric vans out for a while; you see them around a good bit in Dublin.


People are programmed to recite "Like, Subscribe and Share". How are they supposed to do that in a bare bones reality?


Using a more powerful computer won't really change development costs all that much, but it could seriously affect BOM costs. Amazon wants to buy absurd numbers of these, so it's not unreasonable that they've tried to heavily cut costs in the BOM.

(Also what the other commenter said.)


I think that's a false economy.

Suppose a better infotainment system saves even one minute of time per day. Over a 300 day work-year, that's five hours of driver-work, valued on the order of $100 (at $20/hr). All other things being equal, $100 would go a long way in an infotainment BOM.
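Spelling that arithmetic out (all inputs are the assumptions stated above, not measured figures):

```python
# Value of one saved minute per driver per day, per the comment's assumptions.
minutes_saved_per_day = 1
work_days_per_year = 300
driver_wage_per_hour = 20  # dollars

hours_saved = minutes_saved_per_day * work_days_per_year / 60
value_per_driver = hours_saved * driver_wage_per_hour
print(hours_saved, value_per_driver)  # 5.0 hours -> 100.0 dollars/year
```

And that's per driver per year, so over a fleet and a vehicle's lifetime the number multiplies quickly.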

If this isn't being done for cost reasons, I expect it might be a result of bulk ordering. Rivian isn't a large car manufacturer, so it might not be ordering custom equipment from the manufacturers. If they have to compromise by taking an infotainment package wholesale from existing mass-production lines, then maybe there's no "$100-equivalent" upgrade on the market.


Maybe developer workstations should be less powerful than production machines ;)


In every movie production edit bay, there will be a $100 Walmart flatscreen. Because making a video production look good in 4K ProRes is fairly easy. But the majority of your audience is going to watch it on that cheap screen or something similar.


I concur.

Especially UI developers should not have Retina displays or 3x 43" 4K displays when developing web page layouts.


They can have all of that for their editor and Photoshop and whatever. However, their actual preview monitor should be the smallest, cheapest monitor they can find.

Back when I was writing some quite heavy 3D software we would always test each release on a machine that was slightly below our advertised minimum spec to make sure it was at least usable there.


This fully explains my time working at a startup acquired by Amazon. We were provided the lowest-spec MacBook Pros, to the extent that lead engineers were literally waiting around for hours every day for builds to complete. Some couldn't even use two monitors due to display output limitations.

Anyone who has worked for Amazon is fully aware of how the company's internals are literally held together with duct tape.


Amazon's religion is measurement. This is a return to Taylorism, "Scientific Management," and the Bedaux System. But they won't see the problem with that unless they try to measure the efficacy of measuring everything.


Even the R1T had initial impressions that the infotainment wasn't exactly super optimized. I think it's been improved but there's a reason Tesla can get away with such a barebones and average quality interior.


Never mind that. What the hell is going on with the in-truck efficiency and linking stops with app state? They could save so much "paper pushing" that drivers have to do if they just utilized organized cargo areas and GPS.


Gotta have something for your promo doc on frugality if you’re going for L7


I blame bean counters for that kind of nonsense



