m0nk3yz
Mar 13, 2002

Behold the power of cheese!


What is Python? (from http://www.python.org)
Python is a dynamic object-oriented programming language that can be used for many kinds of software development. It offers strong support for integration with other languages and tools, comes with extensive standard libraries, and can be learned in a few days. Many Python programmers report substantial productivity gains and feel the language encourages the development of higher quality, more maintainable code.

Official Website: http://www.python.org/
Official Documentation: http://docs.python.org/
Python Quick references: http://rgruet.free.fr/
Semi-Official Python FAQ: http://effbot.org/pyfaq/
Official Python Cheeseshop: http://pypi.python.org/pypi
Official Python Wiki: http://wiki.python.org/moin/
Python development tools, including editors: http://wiki.python.org/moin/DevelopmentTools
Official Planet Python (weblogs): http://planet.python.org/
Unofficial Planet Python (more weblogs): http://www.planetpython.org/
Python Magazine: http://pythonmagazine.com/
Norvig's Infrequently asked questions: http://norvig.com/python-iaq.html - This man exemplifies elegant code. His solutions are just... beautiful.

Python tutorials and free online books:

What version should I use? Python 2.x or 3.x?
http://wiki.python.org/moin/Python2orPython3 - The official website tells all

Alternative Python interpreter implementations

Python: Myths about Indentations: http://www.secnetix.de/~olli/Python/block_indentation.hawk

Python Editors and IDEs: http://wiki.python.org/moin/PythonEditors (Although you should use VIM http://sontek.net/turning-vim-into-a-modern-python-ide)

More about Python web frameworks than you could ever want to know: http://wiki.python.org/moin/WebFrameworks - though, you should just stick to Django, or Flask. Seriously!

PyCon US!
PyCon US is the largest annual gathering for the community using and developing the open-source Python programming language. PyCon is organized by the Python community for the community. We try to keep registration far cheaper than most comparable technology conferences, to keep PyCon accessible to the widest group possible. http://us.pycon.org/2012/

There are many PyCons - or Python Conferences - throughout the world. You can find one, should the US one not be your cup of tea, at: http://www.pycon.org/

m0nk3yz posted:

Why I like python: I enjoy its syntax, dynamic (latent) typing, and standard library, and I truly enjoy programming in the language. I've been hacking in python for about (edit) 7 or 8 years at this point, and I have come to really enjoy the community and ecosystem around the language as well. Python is expressive, and while I've run into bits of python code that make me WTF, I've rarely run into a chunk I couldn't read or understand - you have to go out of your way (or be abusive with, say, list comprehensions) to make python unreadable.

And just to pimp my own "Good to great Python reads": http://jessenoller.com/good-to-great-python-reads/
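
As a quick illustration of that last point (just a sketch, not code from anywhere in particular) - the first comprehension is the kind most people write, the second is the kind that makes reviewers cry:

code:
# Readable: one transform, one filter
squares = [x * x for x in range(20) if x % 2 == 0]

# Abusive: nesting, a conditional expression, and a filter all crammed into one statement
matrix = [[1, 2, 3], [4, 5, 6]]
weird = [y for row in matrix
           for y in (x * 2 if x % 2 else x // 2 for x in row)
           if y > 1]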

TasteMyHouse posted:

Since python is often recommended as a beginner's language (especially here in CoC) it'd be nice for there to be a short discussion / link to a discussion about interpreted vs compiled in general, the distinction between CPython and Python itself, etc. I know that stuff kind of confused me when I was looking at python and didn't know poo poo from poo poo.
http://programmers.stackexchange.com/questions/24558/python-interpreted-or-compiled
http://en.wikipedia.org/wiki/Interpreted_language
http://en.wikipedia.org/wiki/Python_(programming_language)

m0nk3yz fucked around with this message at 04:26 on Aug 27, 2011


m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Bonus posted:

Yeah I agree, Python is by far my favourite language. It's concise yet readable. It's just a joy to work in, I love how the modules work, the duck typing is great, all in all I really like it. Well, everything except the GIL in CPython, which basically prevents Python from running on multiple cores concurrently. However, there's a great module called Parallel Python which lets you do just that.

That's only partially true: Python can use multiple cores, but python threads cannot (sort of - unless they're inside a C module that has released the GIL, or blocking on I/O), as shown by effbot's recent wide-finder experiment (http://effbot.org/zone/wide-finder.htm). If you fork processes you can easily make use of multiple cores. Of course, fork/exec means you lose the shared context of pthreads.

I personally prefer the Processing (http://pypi.python.org/pypi/processing/) module over the parallel python module; it's thread-API compatible, which for most of my applications means it is a drop-in replacement.
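
To show what "thread-API compatible" means in practice, a minimal sketch, assuming the processing names mirror threading's Process/Queue the way I remember (the same basic API that later became the stdlib multiprocessing module):

code:
from processing import Process, Queue   # stdlib 2.6+ spells this: from multiprocessing import Process, Queue

def worker(q, n):
    # Runs in a separate OS process, so it is not bound by the parent's GIL
    q.put(sum(i * i for i in xrange(n)))

if __name__ == '__main__':   # the guard matters on Windows, where there is no fork()
    q = Queue()
    procs = [Process(target=worker, args=(q, 1000000)) for _ in range(4)]
    for p in procs: p.start()
    for p in procs: p.join()
    print [q.get() for _ in procs]

Swap Process for threading.Thread and Queue for Queue.Queue and you have the exact same code single-process - that's the appeal of the API-compatible approach.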

As for the GIL: yes, the cPython interpreter has a GIL, which is primarily a feature for the Python core developers. However, there are talks and projects to add something like Parallel Python/Processing.py to the stdlib (I'm considering doing a PEP for Processing), and to add a patch set to Python 3000 that allows for free-threading.

More on the GIL:
GvR: http://www.artima.com/weblogs/viewpost.jsp?thread=214235
Brett Cannon: http://sayspy.blogspot.com/2007/11/idea-for-process-concurrency.html
Bruce Eckel: http://www.artima.com/weblogs/viewpost.jsp?thread=214303

m0nk3yz fucked around with this message at 01:45 on Nov 5, 2007

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

N.Z.'s Champion posted:

After hating on the XML processing in Python (4suite is sluggish and buggy) I found LXML and P4X. So use them unless you want to break your brain.

Which version of Python were you using? For my XML needs I've loved ElementTree (and its brother, cElementTree). ElementTree is in the stdlib as of 2.5.
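
If you haven't tried it, here's roughly what the API feels like (using the xml.etree spelling it has in the 2.5 stdlib; swap in cElementTree for the fast C version):

code:
from xml.etree import ElementTree as ET

doc = ET.fromstring("<feed><entry id='1'>hello</entry><entry id='2'>world</entry></feed>")
for entry in doc.findall("entry"):
    print entry.get("id"), entry.text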

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
Just to share, here's some good reading on Python Metaclasses:

http://markshroyer.com/blog/2007/11/09/tilting-at-metaclass-windmills/
http://www.ibm.com/developerworks/linux/library/l-pymeta.html
http://en.wikibooks.org/wiki/Programming:Python_MetaClasses

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

deimos posted:

http://effbot.org/librarybook/

I posted it earlier but OP hasn't added it.

It was/is added

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

deimos posted:

I am blind, sorry.


Now more on topic: woop, properties for 2.6 and 3k: http://bugs.python.org/issue1416

Nice, I totally missed that one. I am quite happy with what's coming down the pipe for py3k
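
For anyone who missed the issue: the win is that 2.6/3.0 let you hang setters (and deleters) off an existing property with decorators instead of the old property(get_x, set_x) dance. Roughly:

code:
class Account(object):
    def __init__(self, balance=0):
        self._balance = balance

    @property
    def balance(self):
        return self._balance

    @balance.setter             # new in 2.6/3.0 - no more property(get_balance, set_balance)
    def balance(self, value):
        if value < 0:
            raise ValueError("balance can't go negative")
        self._balance = value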

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

duck monster posted:

Ah. Nice. Much more elegant than the get_foo set_foo stuff.

I'm really keen on this decorator stuff. Lots of opportunities for aspect coding type stunts.

Speaking of decorators, this popped up on pypi this morning - simple threaded/threadpool decorators:
http://pypi.python.org/pypi/threadec/
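
I haven't looked at that package's internals, but the core idea of a "threaded" decorator is only a few lines - a rough sketch of the general pattern (not the threadec API):

code:
import threading
from functools import wraps

def threaded(func):
    """Run the decorated function in a daemon thread; return the Thread so callers can join()."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        t = threading.Thread(target=func, args=args, kwargs=kwargs)
        t.setDaemon(True)
        t.start()
        return t
    return wrapper

@threaded
def fetch(url):
    print "fetching", url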

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Hammertime posted:

I'm relatively python ignorant, I've only been learning it for a week or so ...
What's the point of threading in python with the GIL, excluding heavy I/O situations?
..snip

I actually just got done writing a fairly lengthy article talking about this (the gil+threads, etc). Here's the short version.

Python, in the cPython interpreter, uses OS-level threads (pthreads, for unix people) and because of this is perfectly capable of spawning threads across multiple cores. However, execution is blocked by the GIL, which prevents more than one thread from executing Python bytecode within the interpreter at once. This keeps the interpreter (and writing extensions) simple, since you don't have to manage thread safety and memory yourself - the core interpreter and the GIL do that for you.

The GIL is there to protect the interpreter's memory - not to hinder users or programmers. Ultimately, it is a cPython interpreter developer feature, not something which benefits end users (unless you are writing c-extensions). Which neatly segues into the next point - any python code which is threaded and only calls python code - i.e. number crunching - will run about as fast as (actually slightly slower than) the same code written single-threaded.

However, if the code you are writing calls any c code (and this includes built-in modules written in c) which is thread-safe and uses the PyGILState_STATE/Py_BEGIN_ALLOW_THREADS macros, your code will spread across processors and run faster than the single-threaded implementation.

For example, 2 threads running a fibonacci calculation will run slightly slower than running the same calculation single threaded - however 2 threads performing disk I/O or socket calls will run much faster than the single threaded example.
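
If you want to see the difference yourself, a rough benchmark sketch - pure-Python number crunching doesn't speed up with threads under the GIL, while sleep/I/O-style waits do:

code:
import threading, time

def cpu_work(n):
    # Pure-python loop: holds the GIL the entire time
    while n:
        n -= 1

def io_work(seconds):
    # Sleeping (like blocking socket/disk calls) releases the GIL
    time.sleep(seconds)

def timed(func, arg, count):
    start = time.time()
    threads = [threading.Thread(target=func, args=(arg,)) for _ in range(count)]
    for t in threads: t.start()
    for t in threads: t.join()
    return time.time() - start

print "cpu-bound, 2 threads: %.2fs" % timed(cpu_work, 5000000, 2)  # no faster than one thread doing both halves
print "i/o-bound, 2 threads: %.2fs" % timed(io_work, 1.0, 2)       # ~1s total, not 2s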

That all being said - the GIL is a feature for the interpreter developers, and it's been stated by many people that removing the GIL will make life quite difficult for a lot of contributors and users. There has been work to remove it and I know of at least two people making a patch set for python 3000 to remove the GIL but keep the ease of use.

There are, however, other options besides using the basic threading module, adding a time.sleep(.00001) (which forces the GIL to be released), and relying on I/O. There are actually many modules that not only mimic the threading API, but allow applications to distribute load across processors and machines.

Here's a "short list" of modules:
http://code.google.com/p/python-distributed/wiki/PythonPackagesOfNote

Check out the rest of the wiki for more information, of note is a quick run down of python GIL related articles:

http://code.google.com/p/python-distributed/wiki/PythonGILRelatedArticles

I tend to prefer the Processing module (http://pypi.python.org/pypi/processing/) given I can drop it in and quickly replace threading when I've run into scaling problems.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

fake posted:

I still don't really understand why the GIL helps interpreter developers. They can't write threadsafe code?

No, it means they don't have to worry about making the interpreter internals thread safe - no worries about memory/data corruption or deadlock issues, and it keeps garbage collection simple and out of the way. Yes, they can write thread-safe code, but as people much smarter than I am (Goetz, Eckel, and others) have pointed out, threading is difficult to get dead-right and can lead to madness, especially when figuring out a testing matrix. Some could also argue that the simplicity of the C interface to python is why there has been a much higher rate of third-party modules developed for python - even third-party modules that bypass the GIL for concurrency.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

bitprophet posted:

Look, between that whitespace bullshit and this inability to handle threading natively, I don't see how anyone can consider Python anything but a toy language :colbert:

Dude, gently caress whitespace. http://timhatch.com/projects/pybraces/

Also, I pre-ordered the django book tonight.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Also note EpyDoc - it's above and beyond pydoc http://epydoc.sourceforge.net/

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Hammertime posted:

Edit2: Well, it removes much of python's threading overhead. Doesn't solve the GIL problem though, still bound to a single core. Neat idea, seems a big waste of effort though, unless I've missed something. :(

You're correct - Stackless just removes the stack, not the Global Interpreter Lock. As has been pointed out previously, if you're trying to sidestep the GIL, look at the modules I (and others) have linked to. If you're worried about the GIL + web applications a la Django: don't be, just yet. When Django apps are served up via mod_python within Apache, the model is wildly different (given Apache does use multiple cpus/cores). Most of the time in web apps you're also i/o (not cpu) bound.

m0nk3yz fucked around with this message at 13:18 on Nov 19, 2007

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

FuraxVZ posted:

I'm looking to get into Python a bit; since I learn best from paper books, any recommendations? And yes, yes, I know, you learn by doing not reading, but I like books for grokking languages. Something with some good depth and the spirit of the language would be great.

The online Dive Into Python is pretty good, and I was thinking of getting the dead-tree version. Is there better?

The latest edition of Core Python Programming is really good. Covers 2.5 to boot.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
I use textmate and Eclipse+Pydev. Textmate when I'm not working on our huge work-projects in house (autocomplete is a must on large projects).

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
If you want an excellent newforms walkthrough, check out the intro to newforms posts on James Bennett's blog:

http://www.b-list.org/weblog/2007/nov/22/newforms/
http://www.b-list.org/weblog/2007/nov/23/newforms/
http://www.b-list.org/weblog/2007/nov/25/newforms/

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

No Safe Word posted:

:psyduck:

Dangit, he asked though. He should have read the thread! :argh:

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Casao posted:

I figure it's worth pointing out that PyObjC, which allows you to program native Cocoa apps for OS X, now comes standard with Leopard and XCode, making Mac Python programming a bigger joy.

One quick note: if you are running 2.5.1 on Leopard and want the new pyobjc stuff from subversion to compile, you have to edit Python.framework/Versions/2.5/lib/python2.5/config/Makefile and remove the "-isysroot /Developer/SDKs/MacOSX10.4u.sdk" chunk from it. Otherwise, the new pyobjc stuff won't compile.

code:
Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) 
[GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import WebKit
>>> 

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
FYI, Pycon registration is now open! http://us.pycon.org/2008/about/

I'll more than likely be going, but skipping the Tutorials day.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
Here's an interesting post from Guido flying around the 'tubes:

http://mail.python.org/pipermail/python-dev/2008-January/076194.html

I personally like it as a "hey, that's neat for testing" idea - mock objects notwithstanding. I like the followup from Robert Brewer where he stashes an old copy of the method off to the side:

http://mail.python.org/pipermail/python-dev/2008-January/076198.html

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

w_hat posted:

From one error to another:
Traceback (most recent call last):
File "C:\Python25\cpumon.py", line 44, in <module>
except pywintypes.error:
NameError: name 'pywintypes' is not defined

I should probably get around to actually learning Python instead of hacking it together.

Interesting you should say this now - Someone just posted an excellent Python reading list here: http://www.wordaligned.org/articles/essential-python-reading-list

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

porkface posted:

You guys know you can safely run multiple versions of Python on the same machine? You just have to set them up right, and choose which version you want to be your generic system-wide install.

This man speaks truth: I have several versions installed on my mac. I just twiddle my bash_profile to point to the one I want.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
Chutwig: Regarding smtplib - did you look at this thread on c.l.p?

http://groups.google.com/group/comp...a1de3ed29133546

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
Figured I'd ask - any Boston/Boston Metro area Python people (or people getting into Python) looking around for a Job()?

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

deimos posted:

Things to add to the op:

Implementations
tinypy - http://www.tinypy.org/ - Sexy small implementation that compiles to C
pypy - http://codespeak.net/pypy/dist/pypy/doc/home.html - Almost self-hosting python implementation? Not sure.


Links:
Norvig's Infrequently asked questions - http://norvig.com/python-iaq.html - This man exemplifies elegant code. His solutions are just... beautiful.

Added

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Scaevolus posted:

I thought the Global Interpreter Lock made this form of threading pointless? Or are you doing multiple processes?

You can use parallel python (http://www.parallelpython.com/) or pyprocessing (http://pyprocessing.berlios.de/) to easily side-step the GIL. My particular favorite is the latter (pyprocessing). In fact, I am working on a PEP and with python-dev to see if I can get pyprocessing into the stdlib for 2.6. The pyprocessing module is a drop-in replacement for the threading module (it's API-compatible).

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
The processing PEP is now live (draft form): http://www.python.org/dev/peps/pep-0371/

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

deimos posted:

I just realized I've been reading m0nk3yz's blog for a while if that's his PEP.

It is, and I hope it's been a decent read, although I've been heads down with the new job/PEP/python magazine work lately.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
And because tuples are immutable, they can be used as dictionary keys!
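
For example:

code:
# Immutable, therefore hashable - handy for sparse grids, memo tables, etc.
distances = {}
distances[("BOS", "NYC")] = 215
distances[("BOS", "PHL")] = 308
print distances[("BOS", "NYC")]

# A list, being mutable, blows up as a key:
# distances[["BOS", "NYC"]] = 215   # TypeError: list objects are unhashable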

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

tripwire posted:

This is a really cool post; I have a somewhat dumb followup question that you or someone else might be able to answer for me.

One of the things the pypy people mentioned was improving the performance and portability of the interpreter to the point where it would be faster and more compatible than ironpython/cpython/jython; specifically they mentioned picking up where psyco left off in terms of speeding up cpu-bound code. I realize one of the stated goals of pypy over psyco is that it's not as dependent on architecture, but will pypy fully support 64-bit processors/code, and how easily will it mesh with parallel python or pyprocessing? I never even bothered to download and compile python in 32-bit compatibility mode to take advantage of psyco, and if I did I suspect it wouldn't play nice with parallel python, although I haven't tried it yet.
In a nutshell, does anyone know whether the pypy devs are going to make it work out of the box on multicore systems on a 64-bit os?

I think the problem right now for PyPy is turning itself from little more than a science project into something more concrete. Right now they're still fiddling around with a lot of stuff (see: http://morepypy.blogspot.com/) and either they're lacking resources or focus. Of course, maybe I'm too focused on shipping stuff.

As for how well it would mesh with PP and PyProcessing: we'll see. It looks like my PEP (371) is getting accepted, and one of the ramifications of that may be some work on my part (or others') to port the module to jython/ironpython/pypy - but given the lack of a GIL in the first two, it may not need to be ported.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
PEP 371 has been accepted/final!

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Scaevolus posted:

Python 2.6b1 was released, changelog here

Things of note:

- Issue #2831: enumerate() now has a ``start`` argument.
- Issue #2138: Add factorial() to the math module.
- Added the multiprocessing module, PEP 371.

Yeah, just note I boned the patch for the multiprocessing module and forgot to add a portion to the makefile - see http://bugs.python.org/issue3150

rant: The fact I have to add a chunk to setup.py in trunk *and* edit a makefile *and* twiddle some of the document indexes to add a single package sucks.
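
For anyone curious, the two smaller items quoted above look like this in 2.6:

code:
import math

# Issue #2831: enumerate() grows a start argument
for lineno, line in enumerate(["alpha", "beta"], 1):
    print lineno, line          # 1 alpha / 2 beta

# Issue #2138: factorial() added to the math module
print math.factorial(5)         # 120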

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

tripwire posted:

m0nk3yz, I know you worked on pyprocessing so perhaps you can answer a stupid question. I'm trying to parallelize a serial python program; in the program there is a list of "chromosome" objects which are supposed to get a unique id when they are instantiated. The relevant code for it looks like this:
code:

class Chromosome(object):
    _id = 0

    def __init__(self, *args, **kwargs):   # original arguments elided in the post
        self._id = self.__get_new_id()
        # ... rest of __init__ elided ...
    id = property(lambda self: self._id)

    @classmethod
    def __get_new_id(cls):
        cls._id += 1
        return cls._id

My question is, what is the best/most elegant way to make sure that concurrent processes don't produce ids which collide with those from any other process?

Since new chromosomes are going to be generated constantly in a loop, should I give up on just incrementing a counter to generate unique ids? Would it be better to rework the __get_new_id function so that it generates a pseudo-random value with a very low chance of collision? Or would I be better off making sure each concurrent process only accesses a central id factory in synchronization? There is probably a really simple solution, but I'm too dumb to figure it out and I don't want to rush into coding without knowing what I'm trying to accomplish.

You have a couple of ways of doing this - you have already touched on one, which is to change __get_new_id to generate pseudo-random integers for each of the objects, and/or to basically make it a shared object with locks/etc (see processing.RLock/etc). In my code, I generate empty objects with unique IDs ahead of time (think "BlankObject.id") and pump a good amount of them into a shared queue which I then pass into the processes (processing.Queue).

The bad thing about my approach is that you could run out of blank objects, so I have to keep a producer in the background pumping in new objects so the workers generating the objects always have new blanks. In your case it does make sense to make a new shared object which essentially generates the unique IDs. In my case, I could also just fill a queue with unique objects - or subclass processing.Queue and override the get() method to generate batches of IDs if the queue is empty.

A simpler approach is to pick a seed and pass it to the child processes so they can in turn pass it to a random call - you have a pretty low chance of collisions with random especially if in your ID you include some other attribute of the object. In another implementation, each object I spawned used a random number (generated from a seed) + 2-3 other attributes of the object being created.

A few things to think about: when using processing.Queue, you pay a serialization and deserialization cost for things coming in and out of the Queue. The same goes for the cost of lock acquisition and release; it really depends on where you want to take the hit. If you go with the "shared object generating the ids" approach, that object is going to have to keep an ever-growing list of IDs it has handed out so that it really does ensure that there isn't a conflict.

Random thought: use a seed passed to random and the machine-time (time.time) to generate the IDs
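
If you go the central-factory route, here's a hedged sketch of what that looks like with the shared-memory primitives (shown with the stdlib multiprocessing names from PEP 371; the standalone processing package is similar, but check its docs):

code:
from multiprocessing import Process, Value

def next_id(counter):
    # The lock serializes the increment across processes, so no two ids collide
    with counter.get_lock():
        counter.value += 1
        return counter.value

def worker(counter):
    ids = [next_id(counter) for _ in range(1000)]
    # ... build Chromosome objects from these ids ...

if __name__ == '__main__':
    counter = Value('i', 0)     # a shared integer living in shared memory
    procs = [Process(target=worker, args=(counter,)) for _ in range(4)]
    for p in procs: p.start()
    for p in procs: p.join()
    print counter.value         # 4000 - every id handed out exactly once

Note that every id costs a lock round-trip; grabbing a block of 100 ids at a time amortizes that if it ever becomes a bottleneck.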

m0nk3yz fucked around with this message at 14:14 on Jun 24, 2008

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Bozart posted:

Why not just use the incremented IDs, and have each process remember its process ID? If you want to combine the results, you can then create unique IDs from the PID and innovation number. Also, I am not sure if NEAT would work if the innovation numbers were not strictly increasing, but I could be wrong. I would mainly just be way too lazy to go through the code and see what I would have to change to use random IDs instead of incremented ones, but to each their own.

Good idea - also, each process and thread inside of python does support get name / get id calls - you can even name them anything you want (say, a unique seed for each one which allows you to make unique names for each process namespace)
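
e.g., roughly (using the stdlib multiprocessing spelling; the older processing module has equivalent calls under slightly different names):

code:
from multiprocessing import Process, current_process
import os

def work():
    print current_process().name, os.getpid()

if __name__ == '__main__':
    procs = [Process(target=work, name="worker-%d" % seed) for seed in (101, 202, 303)]
    for p in procs: p.start()
    for p in procs: p.join()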

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
Ok, so a question for all of you. I'm moving forward with a project to design a ton of functional tests in Nose - the problem being, each test needs to be "told" about the attributes of the system under test (ip addresses, etc.). I can do a few things:
  • use plain text name=value, and have nose parse it and pass it into the tests
  • use YAML (mainly for object instantiation and more advanced stuff) and have the tests (or nose) parse it and pass it in. The bonus here is that I automagically get objects of the specified type, and I don't need additional parsing/coercion.
  • Use a ConfigParser style file, and do the type parsing/coercion myself.
  • Write a test-configuration plugin for nose to do all of this and pass it into the tests. Make the underlying data format irrelevant to the tests.
I'm open to thoughts - just weighing the various options before running off and doing it.
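
For what it's worth, the YAML route really is only a few lines with PyYAML - a rough sketch of option two (the test_targets.yaml name and layout are made up for illustration):

code:
import yaml   # PyYAML

# test_targets.yaml (hypothetical):
#   system_under_test:
#     ip: 10.0.0.12
#     port: 8080
#     use_ssl: true

config = yaml.safe_load(open("test_targets.yaml"))
sut = config["system_under_test"]

def test_can_connect():
    # YAML hands the values back already coerced: port is an int, use_ssl is a bool
    assert isinstance(sut["port"], int)
    assert sut["use_ssl"] in (True, False)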

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
For those of you using nose, I uploaded a new configuration data plugin yesterday:

http://pypi.python.org/pypi/nose-testconfig/

It's a quick and dirty method of passing configuration data down to tests from within the context of nose.
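
Usage looks roughly like the following - the --tc-file/--tc-format flags and the testconfig import here are illustrative, so double-check the PyPI page for the exact names:

code:
# invoked as something like: nosetests --tc-file=targets.yaml --tc-format=yaml
from testconfig import config   # dict the plugin populates before tests run

def test_sut_reachable():
    sut = config['system_under_test']
    assert sut['ip']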

m0nk3yz
Mar 13, 2002

Behold the power of cheese!
Yes, right now (provided I stop breaking the build) the 2.6 and 3.0 releases are "on track"

Download and test the latest betas people!

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

bitprophet posted:

Not a bad idea (as long as you use the built-in "runserver" -- unless you're already a web dev, don't get bogged down with installing Apache or anything), doing the Django tutorial and then making a simple site will expose you to a lot of real-world Python very quickly :)

Apache deployment may cause your hair to turn grey or fall out. I'm almost positive my Dr. needs to put me on blood pressure meds after I fought with it on OS/X.

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

duck monster posted:

Except that literally every unix on the market except for the mac has a coherent policy on package management. I love the mac, but its package management has more in common with loving slackware than anything modern. Download tarball, compile, and pray the gods will be benevolent. And no, easy_install will usually light on fire the minute it's got to compile something and it becomes apparent that Apple puts things in different places than the BSDs or SysV do. Remember, there are about 5 distros of python out there for the mac, none of which are compatible.

Uh, my mac works great for me for installing python things - it's all I develop on. Also, if you're on leopard: you should not be installing any python distribution other than the one it came with, unless it's into your home directory or some /opt location. Thou shalt not change the system python!

And just to add - virtualenv works great too for keeping things in local sandboxes and out of the main framework directories.

m0nk3yz fucked around with this message at 21:08 on Sep 29, 2008

m0nk3yz
Mar 13, 2002

Behold the power of cheese!

outlier posted:

That's not true - there's no problem with compiling a framework build for OSX. I routinely maintain 3 different pythons on my Mac (2.3 for backwards, 2.4 for zope, 2.5 for work). And once you factor eggs in, installing packages is a snap. So your original point is true - python works great on the mac.

(With, admittedly, some exceptions. Over the years I've sometimes had problems with libraries like PIL and MySQLdb that have to be compiled, but I've always been able to eventually hack around them. And I've had problems with these libraries under Redhat as well, so it's difficult to see OSX as deficient in this regard.)

Let me correct myself: I don't compile additional framework builds unless I need to - which is almost never, simply due to the fact I can install a perfectly functional build in my home directory and keep it completely self-contained.

As for compiling packages - this is true pretty much everywhere. People get pretty hung up on having packages with pre-compiled everything, but what happens if you need version 2 and your upstream only provides version 1 (which happens to me with ubuntu all the time)? I'm going to need to compile the darn thing anyway. I guess I just don't see what the big deal is about compiling this stuff.


m0nk3yz
Mar 13, 2002

Behold the power of cheese!

Bozart posted:

This, 1000 times.

IANAWU (I am not a windows user) - but during the 2.6 release process, I had some multiprocessing bugs that came up on windows that I needed to break the vmware out for. I cannot even begin to imagine the pain of regular windows users when it comes to the windows build process. I eventually gave up, winged the patch and deleted the VM. Also, gently caress mingw.
