|
lwn posted:He also, however, noted that he wasn't entirely happy with the "folio" name, thus touching off one of the more predictable dynamics of kernel-community discussions: when the technical side gets involved and intractable, it must be time to have an extended debate on naming. So David Howells suggested "sheaf" or "ream". Torvalds favored something more directly descriptive, like "head_page". Ted Ts'o thought "mempages" would work, or maybe "pageset". Nicholas Piggin favored "cluster" or "superpage". Given the discussion, Vlastimil Babka concluded that the only suitable name was "pageshed". lmao
|
# ? Oct 6, 2021 11:17 |
|
|
|
pageshed would be a good band name
|
# ? Oct 6, 2021 12:59 |
|
I use huge pages regularly for database software and having to talk about it out loud always seems silly. my memory pages. they’re freaking huge
|
# ? Oct 6, 2021 13:21 |
|
whoops, i dropped my huge pages i use for my magnum address space
|
# ? Oct 6, 2021 19:15 |
|
huge pages and this compound page thing they're talking about on lkml are completely different things
|
# ? Oct 6, 2021 20:21 |
|
psiox posted:whoops, i dropped my huge pages i use for my magnum address space
|
# ? Oct 6, 2021 21:01 |
|
psiox posted:whoops, i dropped my huge pages i use for my magnum address space lol
|
# ? Oct 6, 2021 22:35 |
|
i do not want to configure an irc client to ask elsewhere so i will do it here: is there already something that does this before i reinvent a wheel: https://pastebin.com/pg0jJ18n
|
# ? Oct 9, 2021 18:24 |
|
matti posted:i do not want to configure an irc client to ask elsewhere so i will do it here: is there already something that does this before i reinvent a wheel: https://pastebin.com/pg0jJ18n There is no explicit program that does this. This is because "tofile" behavior is not very unix-correct. There are two issues at play:
What I think most Makefiles would do is redirect the output to a temporary file. Then, upon success, rename that temporary file to the real target name. Of course, if you error out a lot, you could have a lot of temporary files. So you'll probably want a "clean" rule in your makefile that cleans all that stuff out. EDIT: cleaned up some stuff sb hermit fucked around with this message at 19:02 on Oct 9, 2021 |
# ? Oct 9, 2021 18:58 |
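sb hermit's temp-file-then-rename pattern, as a minimal C sketch (the function name and filenames here are made up for illustration; on POSIX systems rename(2) over an existing path is atomic):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

/* Write `data` to `path` so readers see either the old file or the
   complete new one, never a half-written file. */
int write_atomic(const char *path, const char *data)
{
    char tmp[4096];
    snprintf(tmp, sizeof tmp, "%s.tmp.XXXXXX", path);

    int fd = mkstemp(tmp);              /* unique temp file in the same dir */
    if (fd < 0)
        return -1;

    size_t len = strlen(data);
    if (write(fd, data, len) != (ssize_t)len || close(fd) != 0) {
        unlink(tmp);                    /* leave no garbage behind on error */
        return -1;
    }
    return rename(tmp, path);           /* atomic replace on POSIX */
}
```

the "a lot of temporary files" problem only shows up if the process dies between mkstemp and rename, which is exactly what the "clean" rule is for.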
|
psiox posted:whoops, i dropped my huge pages i use for my magnum address space
|
# ? Oct 9, 2021 19:05 |
|
There's cleaner ways of doing it on Linux; you can open(2) a _directory_ using O_TMPFILE, which will give you a fd for a new anonymous file on that directory's filesystem but not link it anywhere. Then you can use linkat(2) to atomically link this file into a directory. I don't think there is an equivalent feature on macOS since this all uses POSIX extensions but if you're voluntarily doing development work on macOS then you deserve all the suffering that decision entails. I'm not aware of any tool that does what you describe though.
|
# ? Oct 9, 2021 19:06 |
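Sapozhnik's O_TMPFILE + linkat approach, as a hedged C sketch (Linux-only, and only on filesystems that support O_TMPFILE such as ext4/xfs/btrfs; the function name is made up). Linking through /proc/self/fd with AT_SYMLINK_FOLLOW is the unprivileged route documented in open(2):

```c
#define _GNU_SOURCE
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int write_then_link(const char *dir, const char *name,
                    const char *data, size_t len)
{
    /* Anonymous file on dir's filesystem, not linked anywhere yet;
       if we bail out early it simply vanishes on close. */
    int fd = open(dir, O_TMPFILE | O_WRONLY, 0644);
    if (fd < 0)
        return -1;

    if (write(fd, data, len) != (ssize_t)len) {
        close(fd);
        return -1;
    }

    /* Give the anonymous file a name, atomically. Fails with EEXIST
       if `name` already exists. */
    char proc[64];
    snprintf(proc, sizeof proc, "/proc/self/fd/%d", fd);
    int rc = linkat(AT_FDCWD, proc, AT_FDCWD, name, AT_SYMLINK_FOLLOW);
    close(fd);
    return rc;
}
```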
|
yeah i figure i will create a temporary file in the same directory and do a link and unlink on success. worst case, if the program aborts you are left with a garbage file, but that is not really an issue because realistically that happens never
|
# ? Oct 9, 2021 19:20 |
|
matti posted:but that is not really an issue because realistically that happens never famous last words. alternatively, the linux principle
|
# ? Oct 9, 2021 19:22 |
|
Sapozhnik posted:There's cleaner ways of doing it on Linux; you can open(2) a _directory_ using O_TMPFILE, which will give you a fd for a new anonymous file on that directory's filesystem but not link it anywhere. Then you can use linkat(2) to atomically link this file into a directory. I don't think there is an equivalent feature on macOS since this all uses POSIX extensions but if you're voluntarily doing development work on macOS then you deserve all the suffering that decision entails. this requires CAP_DAC_READ_SEARCH
|
# ? Oct 9, 2021 19:23 |
|
pseudorandom name posted:this requires CAP_DAC_READ_SEARCH my next username
|
# ? Oct 9, 2021 20:54 |
New thread title is a banger
|
|
# ? Oct 9, 2021 22:58 |
|
sb hermit posted:if you wait until a program finishes before you write the file, you end up keeping all that stuff in memory. This could be a problem with large files. if your makefile is creating files that won’t fit comfortably in ram, either you need to stop abusing make to do big data analytics or you need to stop building your code on a toaster
|
# ? Oct 10, 2021 01:37 |
|
Soricidus posted:if your makefile is creating files that won’t fit comfortably in ram, either you need to stop abusing make to do big data analytics or you need to stop building your code on a toaster generating large files is not "abusing" make. it was created at a time when it was very likely that a lot of compilation tasks would be one-pass, precisely because just stuffing output into memory was, if not impossible, at least incredibly wasteful. but beyond that it is dumb as hell to suggest that you should go out and create bespoke tools when literally no aspect of make depends on (or is even aware of) file sizes
|
# ? Oct 10, 2021 08:57 |
|
Gonna quote the loving manual. The loving manual posted:In fact, make is not limited to programs. You can use it to describe any task where some files must be updated automatically from others whenever the others change. Imagine you have a directory where raw ground penetrating radar data is deposited. You want to process those files, but you are always going to keep the originals. Make is absolutely perfect for this. With a Makefile of a few lines you can handle processing and cleanup and get parallel processing for free.
|
# ? Oct 10, 2021 11:36 |
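A sketch of the kind of Makefile that post is describing, assuming hypothetical data/*.raw inputs and a made-up process-gpr tool:

```make
# Process every .raw capture in data/ into a .csv in out/,
# leaving the originals untouched. `process-gpr` is a hypothetical tool.
RAW := $(wildcard data/*.raw)
CSV := $(patsubst data/%.raw,out/%.csv,$(RAW))

all: $(CSV)

out/%.csv: data/%.raw
	@mkdir -p out
	process-gpr $< > $@.tmp && mv $@.tmp $@  # write temp, rename on success

clean:
	rm -f out/*.csv out/*.tmp

.PHONY: all clean
```

make -j8 processes the captures in parallel for free, and only files whose .raw input is newer than the .csv get rebuilt.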
|
Well, there is the considerable expense of having to work with make
|
# ? Oct 10, 2021 11:51 |
|
Cybernetic Vermin posted:generating large files is not "abusing" make, it was created at a time when it was very likely that a lot of compilation tasks would be one-pass precisely because just stuffing output into memory was if not impossible at least incredibly wasteful. but beyond that it is dumb as hell to suggest that you should go out and create bespoke tools when literally no aspect of make depends on (or is even aware of) file sizes that’s what I meant with the toaster part. what was important 50 years ago is not always relevant today. if your task is constrained by memory, it makes more sense to add memory than to add complexity.
|
# ? Oct 10, 2021 12:21 |
|
Soricidus posted:that’s what I meant with the toaster part. what was important 50 years ago is not always relevant today. if your task is constrained by memory, it makes more sense to add memory than to add complexity. writing a program that sticks incoming data in memory and then writes it to a file is not less complicated than just writing the data to a file, and if you want to make the write happen atomically (which i think is a very good idea from op) you either way have to go through a bit more work on the file management side.
|
# ? Oct 10, 2021 13:20 |
|
it turns out that Make is good, and people choose to use it all the time. the fact that it can be used incorrectly, or that incomprehensible Makefiles are routinely generated by autotools, doesn't change that fact. I will admit that having the manual can be helpful if you're not familiar with implicit rules, though.
|
# ? Oct 10, 2021 22:55 |
|
Soricidus posted:that’s what I meant with the toaster part. what was important 50 years ago is not always relevant today. if your task is constrained by memory, it makes more sense to add memory than to add complexity. The fact of the matter is that if you don't know how large the data stream is, then it makes more sense to just write it out to disk (where it would be anyway, if the program was successful) rather than buffering it in memory and potentially eating many megabytes or gigabytes and maybe hitting swap and causing a VM to crash or whatever. There are times when you can't help but keep it all in memory (such as if you're using the sort command). But given the utility's requirements, using up precious memory is unnecessary.
|
# ? Oct 10, 2021 23:03 |
|
sb hermit posted:it turns out that Make is good, and people choose to use it all the time nah. the thing make implements is cool and good, but the way make does it is very bad. makefile syntax is steaming hot rotting garbage, and that's just the tip of the iceberg. the main reasons people choose make are ubiquity, inertia, and the lack of a single obvious successor project
|
# ? Oct 10, 2021 23:42 |
|
sb hermit posted:it turns out that Make is good, and people choose to use it all the time people also choose to use obtuse, antiquated text editors like vi/vim/emacs. that doesn't make them good compared to modern tools. cmake or meson (meson is the nicer high-level build system imo, but cmake has the inertia and ecosystem advantage) spitting out ninja at the low level is how things are done in the 21st century
|
# ? Oct 11, 2021 00:15 |
|
wrote a make target with $(MAKE) in it the other day. it felt dirty
|
# ? Oct 11, 2021 00:18 |
|
the Makefiles i see at work have 100% phony rules. this bothers me
|
# ? Oct 11, 2021 02:34 |
|
somehow i've made it this far in life never having to write a makefile from scratch, just occasionally beat on existing autotools scripts
|
# ? Oct 11, 2021 02:45 |
|
Nomnom Cookie posted:the Makefiles i see at work have 100% phony rules. this bothers me oh no my makefiles are the same because no real file-based artifacts are generated
|
# ? Oct 11, 2021 02:57 |
|
I wrote makefiles for compiling LaTeX documents to pdf. I would put everything in a git repo and had a hand-crafted gitignore to exclude all the intermediate files and stuff. then sharelatex and overleaf came along and made all that stuff obsolete overnight.
|
# ? Oct 11, 2021 03:49 |
|
The_Franz posted:people also choose to use obtuse, antiquated text editors like vi/vim/emacs. that doesn't make them good compared to modern tools on the other hand, c/cpp developers don't deserve good tooling
|
# ? Oct 11, 2021 08:28 |
|
just use a modern language like python, op
|
# ? Oct 11, 2021 18:20 |
|
i wrote the utility and it works well enough for my own purposes 🤷
|
# ? Oct 11, 2021 20:33 |
|
spankmeister posted:I wrote makefiles for compiling LaTeX documents to pdf. I would put everything in a git repo and had a hand-crafted gitignore to exclude all the intermediate files and stuff. there is virtue in making something and then letting it go
|
# ? Oct 12, 2021 06:11 |
|
sb hermit posted:The fact of the matter is that if you don't know how large the data stream is, then it makes more sense to just write it out to disk (where it would be anyway, if the program was successful) rather than buffering it in memory and potentially eating many megabytes or gigabytes and maybe hitting swap and causing a VM to crash or whatever. writing data to files to avoid using up the ~precious memory~ that my os could otherwise be using to avoid writing the files to disk. by all means write your data to files if that’s the simplest way to do your thing. just don’t jump through hoops trying to avoid keeping it in memory, because it’s 2021 and running out of memory is an edge case
|
# ? Oct 12, 2021 11:07 |
|
unless you're running on kooberneetus and only have a few hundred mb of headspace treating streaming data as streams isn't bad advice lol (although if you're running make in kubernetes you have other problems)
|
# ? Oct 12, 2021 14:19 |
|
flashpoll, which one would you rather see in a script: (1) cat butt | grep fart | cut -d' ' -f1 or (2) cat butt | awk '/fart/ { print $1 }' ?
|
# ? Oct 12, 2021 16:54 |
|
awk is better than cut. edit: meson > cmake.
|
# ? Oct 12, 2021 16:56 |
|
|
|
1 because i sort of understand it at a glance
|
# ? Oct 12, 2021 16:58 |