Antigravitas
Dec 8, 2019

The salvation for the farmers:

lwn posted:

He also, however, noted that he wasn't entirely happy with the "folio" name, thus touching off one of the more predictable dynamics of kernel-community discussions: when the technical side gets involved and intractable, it must be time to have an extended debate on naming. So David Howells suggested "sheaf" or "ream". Torvalds favored something more directly descriptive, like "head_page". Ted Ts'o thought "mempages" would work, or maybe "pageset". Nicholas Piggin favored "cluster" or "superpage". Given the discussion, Vlastimil Babka concluded that the only suitable name was "pageshed".

lmao

RocketLunatic
May 6, 2005
i love lamp.
pageshed would be a good band name

yummycheese
Mar 28, 2004

I use huge pages regularly for database software and having to talk about it out loud always seems silly.

my memory pages. they’re freaking huge

psiox
Oct 15, 2001

Babylon 5 Street Team
whoops, i dropped my huge pages i use for my magnum address space

Sapozhnik
Jan 2, 2005

Nap Ghost
huge pages and this compound page thing they're talking about on lkml are completely different things

shoeberto
Jun 13, 2020

which way to the MACHINES?

psiox posted:

whoops, i dropped my huge pages i use for my magnum address space

Rufus Ping
Dec 27, 2006

I'm a Friend of Rodney Nano

psiox posted:

whoops, i dropped my huge pages i use for my magnum address space

lol

matti
Mar 31, 2019

i do not want to configure an irc client to ask elsewhere so i will do it here: is there already something that does this, before i reinvent the wheel? https://pastebin.com/pg0jJ18n

sb hermit
Dec 13, 2016

matti posted:

i do not want to configure an irc client to ask elsewhere so i will do it here: is there already something that does this, before i reinvent the wheel? https://pastebin.com/pg0jJ18n

There is no explicit program that does this. This is because "tofile" behavior is not very unix-correct. There are two issues at play:
  • if you wait until a program finishes before you write the file, you end up keeping all that stuff in memory. This could be a problem with large files.
  • If you write the file before the program finishes, but delete it if there's an error, it could lead to a race condition (independent of Makefiles... as in, if a program is monitoring a directory for those files, it could act before stuff is written out). Also, if tofile itself encounters an error or is killed prematurely (or if someone just flips the power) then there is no way to delete the target file.

What I think most Makefiles would do is redirect the output to a temporary file. Then, upon success, rename that temporary file to the real target name. Of course, if you error out a lot, you could have a lot of temporary files. So you'll probably want a "clean" rule in your makefile that cleans all that stuff out.
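
Roughly like this in C, as a "tofile"-style filter (interface guessed, your pastebin probably differs):

    /* copy stdin to TARGET; the target name only appears once
       everything has been written out successfully */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: tofile TARGET\n"); return 1; }

        char tmp[4096];
        snprintf(tmp, sizeof tmp, "%s.XXXXXX", argv[1]);  /* temp file in the same dir */
        int fd = mkstemp(tmp);
        if (fd < 0) { perror("mkstemp"); return 1; }

        char buf[65536];
        ssize_t n;
        while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0)
            if (write(fd, buf, n) != n) { perror("write"); unlink(tmp); return 1; }
        if (n < 0) { perror("read"); unlink(tmp); return 1; }

        if (fsync(fd) != 0 || close(fd) != 0) { perror("close"); unlink(tmp); return 1; }
        if (rename(tmp, argv[1]) != 0) { perror("rename"); unlink(tmp); return 1; }  /* atomic on the same fs */
        return 0;
    }

usage would be something like: some_program | tofile output.dat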

EDIT: cleaned up some stuff

sb hermit fucked around with this message at 19:02 on Oct 9, 2021

jesus WEP
Oct 17, 2004


psiox posted:

whoops, i dropped my huge pages i use for my magnum address space

lmao

Sapozhnik
Jan 2, 2005

Nap Ghost
There are cleaner ways of doing it on Linux; you can open(2) a _directory_ using O_TMPFILE, which will give you a fd for a new anonymous file on that directory's filesystem but not link it anywhere. Then you can use linkat(2) to atomically link this file into a directory. I don't think there is an equivalent feature on macOS since this all relies on Linux-specific extensions, but if you're voluntarily doing development work on macOS then you deserve all the suffering that decision entails.
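
roughly like this (made-up paths, minimal error handling):

    #define _GNU_SOURCE            /* O_TMPFILE and AT_EMPTY_PATH are Linux-only */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* anonymous file on /some/dir's filesystem, not linked anywhere yet */
        int fd = open("/some/dir", O_TMPFILE | O_WRONLY, 0644);
        if (fd < 0) { perror("open"); return 1; }

        /* ... write all the output into fd here ... */

        /* atomically give the file a name; fails with EEXIST if the target
           already exists. this AT_EMPTY_PATH form needs a capability, the
           /proc/self/fd trick from the open(2) man page does not. */
        if (linkat(fd, "", AT_FDCWD, "/some/dir/output", AT_EMPTY_PATH) != 0) {
            perror("linkat");
            return 1;
        }
        return 0;
    }

and if the process dies before the linkat, the anonymous file just evaporates with the last fd, which is the nice part.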

I'm not aware of any tool that does what you describe though.

matti
Mar 31, 2019

yeah i figure i will create a temporary file in the same directory and do a link and unlink on success

worst case if the program aborts you are left with a garbage file but that is not really an issue because realistically that happens never

infernal machines
Oct 11, 2012

we monitor many frequencies. we listen always. came a voice, out of the babel of tongues, speaking to us. it played us a mighty dub.

matti posted:

but that is not really an issue because realistically that happens never

famous last words

alternatively, the linux principle

pseudorandom name
May 6, 2007

Sapozhnik posted:

There are cleaner ways of doing it on Linux; you can open(2) a _directory_ using O_TMPFILE, which will give you a fd for a new anonymous file on that directory's filesystem but not link it anywhere. Then you can use linkat(2) to atomically link this file into a directory. I don't think there is an equivalent feature on macOS since this all relies on Linux-specific extensions, but if you're voluntarily doing development work on macOS then you deserve all the suffering that decision entails.

I'm not aware of any tool that does what you describe though.

this requires CAP_DAC_READ_SEARCH

sb hermit
Dec 13, 2016

pseudorandom name posted:

this requires CAP_DAC_READ_SEARCH

my next username

Nitrousoxide
May 30, 2011

do not buy a oneplus phone

New thread title is a banger

Soricidus
Oct 21, 2010
freedom-hating statist shill

sb hermit posted:

if you wait until a program finishes before you write the file, you end up keeping all that stuff in memory. This could be a problem with large files.

if your makefile is creating files that won’t fit comfortably in ram, either you need to stop abusing make to do big data analytics or you need to stop building your code on a toaster

Cybernetic Vermin
Apr 18, 2005

Soricidus posted:

if your makefile is creating files that won’t fit comfortably in ram, either you need to stop abusing make to do big data analytics or you need to stop building your code on a toaster

generating large files is not "abusing" make. it was created at a time when it was very likely that a lot of compilation tasks would be one-pass, precisely because just stuffing output into memory was, if not impossible, at least incredibly wasteful. but beyond that, it is dumb as hell to suggest that you should go out and create bespoke tools when literally no aspect of make depends on (or is even aware of) file sizes

Antigravitas
Dec 8, 2019

The salvation for the farmers:
Gonna quote the loving manual:

The loving manual posted:

In fact, make is not limited to programs. You can use it to describe any task where some files must be updated automatically from others whenever the others change.

Imagine you have a directory where raw ground penetrating radar data is deposited. You want to process those files, but you are always going to keep the originals. Make is absolutely perfect for this. With a Makefile of a few lines you can handle processing and cleanup and get parallel processing for free.

Pythagoras a trois
Feb 19, 2004

I have a lot of points to make and I will make them later.
Well, there is the considerable expense of having to work with make

Soricidus
Oct 21, 2010
freedom-hating statist shill

Cybernetic Vermin posted:

generating large files is not "abusing" make. it was created at a time when it was very likely that a lot of compilation tasks would be one-pass, precisely because just stuffing output into memory was, if not impossible, at least incredibly wasteful. but beyond that, it is dumb as hell to suggest that you should go out and create bespoke tools when literally no aspect of make depends on (or is even aware of) file sizes

that’s what I meant with the toaster part. what was important 50 years ago is not always relevant today. if your task is constrained by memory, it makes more sense to add memory than to add complexity.

Cybernetic Vermin
Apr 18, 2005

Soricidus posted:

that’s what I meant with the toaster part. what was important 50 years ago is not always relevant today. if your task is constrained by memory, it makes more sense to add memory than to add complexity.

writing a program that sticks incoming data in memory and then writes it to a file is not less complicated than just writing the data to a file, and if you want to make the write happen atomically (which i think is a very good idea from op) you have to do a bit more work on the file management side either way.

sb hermit
Dec 13, 2016

it turns out that Make is good, and people choose to use it all the time

the fact that it can be used incorrectly, or that incomprehensible Makefiles are routinely generated by autotools, doesn't change that fact.

I will admit that having the manual can be helpful if you're not familiar with implicit rules, though.

sb hermit
Dec 13, 2016

Soricidus posted:

that’s what I meant with the toaster part. what was important 50 years ago is not always relevant today. if your task is constrained by memory, it makes more sense to add memory than to add complexity.

The fact of the matter is that if you don't know how large the data stream is, then it makes more sense to just write it out to disk (where it would be anyway, if the program was successful) rather than buffering it in memory and potentially eating many megabytes or gigabytes and maybe hitting swap and causing a VM to crash or whatever.

There are times when you can't help but keep it all in memory (such as if you're using the sort command). But given the utility's requirements, using up precious memory is unnecessary.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

sb hermit posted:

it turns out that Make is good, and people choose to use it all the time

the fact that it can be used incorrectly, or that incomprehensible Makefiles are routinely generated by autotools, doesn't change that fact.

I will admit that having the manual can be helpful if you're not familiar with implicit rules, though.

nah

the thing make implements is cool and good, but the way make does it is very bad

makefile syntax is steaming hot rotting garbage, and that's just the tip of the iceberg

the main reasons people choose make are ubiquity, inertia, and the lack of a single obvious successor project

The_Franz
Aug 8, 2003

sb hermit posted:

it turns out that Make is good, and people choose to use it all the time

people also choose to use obtuse, antiquated text editors like vi/vim/emacs. that doesn't make them good compared to modern tools

cmake or meson (meson is the nicer high-level build system imo, but cmake has the inertia and ecosystem advantage) spitting out ninja at the low level is how things are done in the 21st century

dougdrums
Feb 25, 2005
CLIENT REQUESTED ELECTRONIC FUNDING RECEIPT (FUNDS NOW)
wrote a make target with $(MAKE) in it the other day. it felt dirty

Nomnom Cookie
Aug 30, 2009

the Makefiles i see at work have 100% phony rules. this bothers me

The_Franz
Aug 8, 2003

somehow i've made it this far in life never having to write a makefile from scratch, just occasionally beat on existing autotools scripts

psiox
Oct 15, 2001

Babylon 5 Street Team

Nomnom Cookie posted:

the Makefiles i see at work have 100% phony rules. this bothers me

oh no my makefiles are the same because no real file-based artifacts are generated :(

spankmeister
Jun 15, 2008

I wrote makefiles for compiling LaTeX documents to pdf. I would put everything in a git repo and had a hand-crafted gitignore to exclude all the intermediate files and stuff.

then sharelatex and overleaf came along and made all that stuff obsolete overnight.

Progressive JPEG
Feb 19, 2003

The_Franz posted:

people also choose to use obtuse, antiquated text editors like vi/vim/emacs. that doesn't make them good compared to modern tools

cmake or meson (meson is the nicer high-level build system imo, but cmake has the inertia and ecosystem advantage) spitting out ninja at the low level is how things are done in the 21st century

on the other hand, c/cpp developers don't deserve good tooling

Best Bi Geek Squid
Mar 25, 2016
just use a modern language like python, op

matti
Mar 31, 2019

i wrote the utility and it works well enough for my own purposes 🤷

Pythagoras a trois
Feb 19, 2004

I have a lot of points to make and I will make them later.

spankmeister posted:

I wrote makefiles for compiling LaTeX documents to pdf. I would put everything in a git repo and had a hand-crafted gitignore to exclude all the intermediate files and stuff.

then sharelatex and overleaf came along and made all that stuff obsolete overnight.

there is virtue in making something and then letting it go

Soricidus
Oct 21, 2010
freedom-hating statist shill

sb hermit posted:

The fact of the matter is that if you don't know how large the data stream is, then it makes more sense to just write it out to disk (where it would be anyway, if the program was successful) rather than buffering it in memory and potentially eating many megabytes or gigabytes and maybe hitting swap and causing a VM to crash or whatever.

There are times when you can't help but keep it all in memory (such as if you're using the sort command). But given the utility's requirements, using up precious memory is unnecessary.

writing data to files to avoid using up the ~precious memory~ that my os could otherwise be using to avoid writing the files to disk

by all means write your data to files if that’s the simplest way to do your thing. just don’t jump through hoops trying to avoid keeping it in memory, because it’s 2021 and running out of memory is an edge case

animist
Aug 28, 2018
unless you're running on kooberneetus and only have a few hundred mb of headspace

treating streaming data as streams isn't bad advice lol

(although if you're running make in kubernetes you have other problems)

NihilCredo
Jun 6, 2011

suppress anger in every possible way: that one thing will defame you more than many virtues will commend you

flashpoll, which one would you rather see in a script:

(1) cat butt | grep fart | cut -d' ' -f1

or

(2) cat butt | awk '/fart/ { print $1 }'

?

FlapYoJacks
Feb 12, 2009
awk is better than cut. :colbert:

edit

meson > cmake. :colbert:

mycophobia
May 7, 2008
1 because i sort of understand it at a glance
