
Friday Fun Thread for January 26, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


If anybody is interested, the shenanigans that I described almost exactly a year ago in my "Six Months in the Life of a Civil Engineer" AAQC have not gotten any less ridiculous.

  • The "unusually large" project mentioned in the previous post had a length of seven miles on two intersecting segments of two highways, as well as around 900 individual stripes. (The figure of 800 that I gave previously was erroneous.)

  • In contrast, the latest project assigned to my office has a length of fifty miles spread across ten non-intersecting segments of six highways, and has around 1600 individual stripes. (The new project has fewer stripes per mile because it's mostly on two-lane cross sections and includes no interchanges, while the "unusually large" project was mostly on six-lane cross sections and included several interchanges. In terms of lane-miles, it's an increase from 55 to 120.)

How did this happen? Supposedly (as relayed by my immediate boss), the bigwigs decided to merge multiple smaller projects into a single gigantic project, so that the paving contractors will bid lower unit prices on the larger quantity of thin-surface-treatment material. Decisions of this nature have been made before. An egregious example on which I worked:

  • A pavement-preservation project was originally slated to cover a single nine-mile-long segment of highway (reusing the topo files of an older resurfacing project that included the same segment).

  • Then, for reasons to which I was not privy at the time, the bigwigs decided to split it up into three three-mile projects prior to the start of design.

  • And, finally, after the three projects had already been designed by three different engineers and submitted for initial review, the bigwigs decided (again, for reasons to which I was not privy at the time) that they had to be glued back together into a single project in the brief period between initial review and final submission!

It was utterly mind-boggling.

On the bright side:

  • It seems that four of the new project's ten segments have been resurfaced within the past few years, and we may be able to reuse the electronic files of those resurfacing projects. So "only" twenty miles of baselines/topo/stripes, rather than the entire fifty miles, will have to be drawn from as-builts and satellite photographs.

  • Supposedly, the topo drawing for this project will be done by my organization's drafters, rather than by us designers. How well the arrangement will actually work out remains to be seen.


On another bright side, the HTML/JavaScript program that I mentioned in a comment (the one I was writing as a replacement for a clunky old Excel/VBA program) has turned out to be surprisingly functional. The engineers of my office have successfully used it on multiple projects so far.

It's reasonably simple:

  • Manually measure all the quantities in your project's prop (proposed-work) files.

  • Using Excel, input the numbers into a bunch of CSV files with predefined columns (pay-item ID, pay-item name, sheet group, sheet number, quantity, unit, location on sheet, comment, etc.), arranging the pay items in and among the files however you want—e.g., one CSV file for each construction sheet, one CSV file for all the project-wide boilerplate items (like mobilization and steel price adjustment) that aren't associated with a sheet, one CSV file for all the project-wide environmental items (like silt fence and concrete washout) that aren't associated with a sheet, etc.

  • Open an HTML file on your computer and "upload" the CSV files into it. (The dataset stays in your computer's memory; nothing is actually uploaded to an external server. A rough sketch of this step and the checking step appears after this list.)

  • Click a button to process the input.

  • Check the warning and error messages. For example:

      • Did you input a quantity as zero or negative?
      • Did you input a non-integer quantity for a pay item (other than permanent signs, which uniquely are allowed to be non-integer quantities of square feet)?
      • Did you forget to add the federal project number or a "state-funded" placeholder?
      • Did you forget to list your initials in the "calculator" column and your immediate boss's initials in the "reviewer" column?
      • For a project-wide lump-sum pay item, did you input a quantity that is not 1 (100 percent)—or, if the pay item is split between multiple funding sources (I haven't personally seen any such projects, but the author of the old Excel/VBA program explicitly mentioned this as something that (1) does happen and (2) can't be handled by his program), quantities that do not sum to 1?
      • Did you input pay items with the same ID but different names in two different places?
      • Et cetera.

  • Click another button to get, in HTML format ("printable" directly to 22″×36″ PDF), the table of quantities that always constitutes sheet 2 of the project's sheet booklet. (This output step is also sketched after the list.)

  • Click a third button to get, in HTML format ("printable" directly to 8.5″×11″ PDF), a list of all the quantities in the project, sorted by ID and sheet, with location on the sheet noted (required by the Construction people).
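For the curious, none of this requires anything fancier than plain browser JavaScript. Below is a stripped-down sketch of what the "upload"-and-check step can look like; the column names, the element ID, and the specific checks are illustrative stand-ins, not the program's real ones.

```javascript
// Illustrative sketch only: read user-selected CSV files entirely in the
// browser (no server involved) and flag a few of the kinds of problems
// described above. Column names, element IDs, and rules are hypothetical.
function parseCsv(text) {
  // Naive CSV split; a real program would handle quoted commas, etc.
  const [header, ...rows] = text.trim().split(/\r?\n/).map(line => line.split(","));
  return rows.map(cells =>
    Object.fromEntries(header.map((name, i) => [name.trim(), (cells[i] ?? "").trim()])));
}

function checkRows(rows, warnings) {
  for (const row of rows) {
    const qty = Number(row.quantity);
    if (!(qty > 0)) {   // catches zero, negative, and non-numeric values
      warnings.push(`Zero/negative/non-numeric quantity for ${row.itemId} on sheet ${row.sheetNumber}`);
    }
    if (row.unit !== "SF" && !Number.isInteger(qty)) {
      // Hypothetical stand-in for the "permanent signs may be non-integer" exception.
      warnings.push(`Non-integer quantity for ${row.itemId} on sheet ${row.sheetNumber}`);
    }
  }
}

// Wire it to an <input type="file" id="csvFiles" multiple> element.
document.getElementById("csvFiles").addEventListener("change", async (event) => {
  const warnings = [];
  const allRows = [];
  for (const file of event.target.files) {
    const rows = parseCsv(await file.text());   // the file stays in memory; nothing leaves the machine
    checkRows(rows, warnings);
    allRows.push(...rows);
  }
  console.log(warnings.length ? warnings.join("\n") : "No warnings.");
});
```

The "printable" outputs are similarly unexciting in principle: generated HTML plus a CSS @page rule to hit the right sheet size, which the browser's print dialog can then save straight to PDF. Again, a hypothetical sketch rather than the real markup:

```javascript
// Illustrative sketch: turn the in-memory rows into an HTML table sized for a
// 22″×36″ sheet via a CSS @page rule, then hand it to the browser's print-to-PDF.
function renderQuantityTable(rows) {
  const body = rows.map(r =>
    `<tr><td>${r.itemId}</td><td>${r.itemName}</td><td>${r.quantity}</td><td>${r.unit}</td></tr>`
  ).join("");
  const win = window.open("", "_blank");
  win.document.write(`<!DOCTYPE html>
    <html><head><style>
      @page { size: 36in 22in; margin: 0.5in; }   /* prints directly to a 22″×36″ PDF */
      table { border-collapse: collapse; width: 100%; }
      td, th { border: 1px solid black; padding: 2pt 4pt; font: 8pt sans-serif; }
    </style></head>
    <body><table>
      <tr><th>Item ID</th><th>Item Name</th><th>Quantity</th><th>Unit</th></tr>
      ${body}
    </table></body></html>`);
  win.document.close();
  win.print();   // "Save as PDF" in the print dialog
}
```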

In the semi-near future, this manual process may be partially or entirely superseded by automatic quantity-calculation functionality in the new CAD software that I mentioned in the same comment. (That functionality appears to be accomplished via VBA. Sigh.) But I still am quite surprised that a random civil engineer who has done a little programming in his spare time can whip up in just two or three months something that seems a lot more intelligible than the Excel/VBA program that's been used for twenty or thirty years by the designers of an entire reasonably-large transportation organization.


More people need to make lengthy posts about their cool jobs in the vein of my previous post! I've been waiting with bated breath for the past year to hear about the dreaded "scrum master", "daily stand-up", and "Git merge conflict" from some of the 10× programmers that supposedly frequent this website. Maybe we even have an architect who can complain about his clients' wishy-washiness and scoff at all the pathetic free (libre) attempts to compete with Chief Architect, or a paving contractor who can express his hatred of his local transportation authority's resident engineers and in-house designers in the strongest of terms.

Damn. It’s kind of comforting to know that bureaucratic train wrecks aren’t limited to the defense industry.

Um, this is the Friday fun thread, sir. I dunno about work stories.

I work in software consulting, and if I regaled everyone here with the dumb shit my clients (and, occasionally, my consultants) do, there would be a bottomless supply of "laugh so you don't cry".

I would read this. Pls write.

Doooo iiiiitttt.

Especially the consultants.

Edit: not because I have some weird vendetta against consultants, but because I find it more interesting when someone fucks up in a dumb way at something they're trained for.

I once worked (late ’00s) at an architectural plans copy/scanning shop where they manually renamed the scanned PDF files by opening them in Adobe Reader, closing the file, and renaming the file. Sometimes they didn’t rename them by page name (M3, E24, etc.) and just renamed them 001, 002, 003, etc., which saved time; but since the scanner didn’t flip the pages, they were always last page first, and people would lose count.

I scanned mine as TIFFs, opened them in IrfanView, and renamed them there. Then I did batch conversions to PDF. I was so fast. Since the files were sequential in reverse, I could rename them in backward order with batch renaming.

Since the other guys didn’t want to learn IrfanView, I set up a batch file to reverse-rename them. It worked, but the final file was always numbered 200 and the first was somewhere over number 1.

So I did what any reasonably smart, programming-trained DOS user would: I wrote a BASIC program to generate nearly identical reverse-renaming batch files for 5 to 500 pages, and also generated a 495-line “chooser” batch file to pick the right one based on whether file 500 existed, then file 499, 498, etc. Then I added it to the shell:sendto menu so all they’d have to do is right-click the last file and send it to the renamer.
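(If it helps to picture it: done today, the same reverse-rename trick would be a few lines of something like Node.js instead of a generated stack of batch files. A stand-in sketch, not what I actually ran back then:)

```javascript
// Stand-in sketch (Node.js), not the original BASIC/batch setup: files come off
// the scanner as 001.tif, 002.tif, ... with the document's LAST page first, so
// reverse the numbering so that the last-scanned file becomes page 001.
const fs = require("fs");

const files = fs.readdirSync(".")
  .filter(name => /^\d{3}\.tif$/.test(name))
  .sort();   // lexicographic sort works because the names are zero-padded

files.forEach((name, i) => {
  const reversed = String(files.length - i).padStart(3, "0");
  fs.renameSync(name, reversed + ".renamed.tif");   // temporary suffix avoids clobbering
});

// Second pass: drop the temporary suffix.
for (const name of fs.readdirSync(".").filter(n => n.endsWith(".renamed.tif"))) {
  fs.renameSync(name, name.replace(".renamed.tif", ".tif"));
}
```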

In my next job, also document scanning, I created all sorts of image conversion and PDF manipulation scripts, storing their shortcuts in shell:sendto. By the end of that job, my sendto menu was taller than the screen.

It’s always difficult for me to appreciate what others might find interesting in my daily tedium haha.

I can offer some tidbits, though. I’m on a small team (<5 ICs plus manager) and our daily standups are scheduled for 15 minutes. They routinely go 1-2 hours. Every day. I blame a mixture of managers trying to catch up on what’s going on and some people on the team just… always having a lot to say.

Over my career I’ve landed on a kind of “uncertainty principle” for data: the more basic and essential to the business a piece of data is, the more unknowable it is.

Three examples:

I once worked at a company on a pricing-competitiveness project, so I asked the following question: "Here's literally a customer ID. What are we charging this person?" And I got 4 different answers. Each was labeled "production" and I was told each was correct.

Another role I once had involved my company’s network, so here's a question: "How much traffic is on the network, right now? Where is it going?"

We’re talking like 10-20% error in these measurements. It’s wild to me. I think this definitely plays a role in how far short of its promise “data science” has fallen. It’s a marginal gain, and for so many companies I’ve worked with there’s been much, much lower-hanging fruit.

The last example was an ISP trying to improve the efficiency of their technician house calls. Question: in how many of the visits last month did the tech actually resolve the problem, as opposed to doing something pointless, telling the customer the problem was solved, and the problem just going away on its own? Who knows!

The catch to this one is that if the stat was knowable—if it was a solved problem—would it be your core business?

Not that it makes the continued failure to implode any less impressive, mind you.

But I still am quite surprised that a random civil engineer who has done a little programming in his spare time can whip up in just two or three months something that seems a lot more intelligible than the Excel/VBA program that's been used for twenty or thirty years

You seem young. I know of a large, state-level financial institution whose servers were running Windows with kludgy Visual Basic scripts on top of that. Well, accounting etc. ran on AS/400, but the client data was stored in an Access database and processed with VB scripts.

Communication between servers was handled by writing files into shared directories.

Okay, this was circa 2008, but still.