Download Microsoft C/C++ compiler for use with Python 2.6/2.7 ASAP


sturlamolden

It should also be mentioned that the Windows 7 SDK includes
vcbuild.exe, so it can be used to compile Visual Studio 2008 projects
(I'm going to try it with Python).

Not sure why I forgot to mention, but we can (or even should?) use
CMake to generate these project files. We don't need Visual Studio for
that.
 

Thomas Jollans

There are really three things of concern here:
a) operating system file handles, of type HANDLE (which is an unsigned
32-bit value); they are not contiguous, and stdin/stdout/stderr may
have arbitrary numbers
b) C runtime file handles, of type int. They are contiguous, and
stdin/stdout/stderr are 0/1/2.
c) C FILE*.

OS handles can be passed around freely within a process; across
processes, they lose their meaning.

It's the data of types b) and c) that cause problems: the CRT handle 4
means different things depending on what copy of the CRT is interpreting it.

Ah, okay. On UNIX systems, of course, a) and b) are identical.
It's worse with FILE*: passing a FILE* of one CRT to the fread()
implementation of a different CRT will cause a segfault.


Since when is CPython managed code?

It's not managed code in the "runs on .net" sense, but in principle, it
is managed, in that garbage collection is managed for you.
 

sturlamolden

It's not managed code in the "runs on .net" sense, but in principle, it
is managed, in that garbage collection is managed for you.

I think you are confusing Python and C code.
 

Thomas Jollans

I think you are confusing Python and C code.

Or somebody's confusing something anyway

I meant "CPython based", which, in hindsight, might not have been
clear from the grammatically obfuscated sentence I posted.
 

David Cournapeau

I'm also rather sure that it's pretty much impossible to have multiple C
libraries in one process on UNIX, but it's obviously quite possible on
Windows.

I am not sure why you think it is not possible. It is rare, though.
Are you telling me that file descriptors (it's a flippin int!) can't be
passed around universally on Windows??

Yes. The reason why it does not work on Windows is that file
descriptors are not "native": in every Unix I have heard of, file
descriptors are indexes into a table in the process structure, hence
can be shared freely (no dependency on the C runtime structures). On
Windows, I have heard they are emulated (the native one is the Win32
file handle).

David
 

Alf P. Steinbach /Usenet

* sturlamolden, on 06.07.2010 19:35:
You have to be sure PyMem_Malloc is not a preprocessor alias for
malloc (I haven't checked).

Python 3.1.1, file [pymem.h]:

PyAPI_FUNC(void *) PyMem_Malloc(size_t);

#define PyMem_MALLOC(n) (((n) < 0 || (n) > PY_SSIZE_T_MAX) ? NULL \
: malloc((n) ? (n) : 1))

The problem with the latter is that it seems intended for safety but does
the opposite...

Perhaps (if it isn't intentional) this is a bug of the oversight type, that
nobody remembered to update the macro?


***


Except for the problems with file descriptors I think a practical interim
solution for extensions implemented in C could be to just link the runtime lib
statically. For a minimal extension this increased the size from 8 KiB to 49
KiB. And generally with MS tools the size is acceptably small.

I think that this would be safe because since the C API has to access things in
the interpreter I think it's a given that all the relevant functions delegate to
shared library (DLL) implementations, but I have not checked the source code.

As a more long-term solution, perhaps python.org could make available the
redistributables for various MSVC versions, and then one could introduce some
scheme for indicating the runtime lib dependencies of any given extension. Then
when installing an extension the installer (distutils package functionality)
could just check whether the required runtime is present, and if not give the
user the choice of automatically downloading from python.org, or perhaps direct
from Microsoft. This scheme would support dependencies on new runtime lib
versions not yet conceived when the user's version of Python was installed.


Cheers,

- Alf
 

Stephen Hansen

Nonsense. They have released VS2010, but they certainly have not
"withdrawn" VS2008, and I have heard of no plans to do so.

It's not nonsense; Microsoft has historically withdrawn previous
versions of the suite fairly quickly after a new release is out. There
hasn't been any serious notification before it happens.

The concern here is not at all without precedent. There has been some
very real pain for Python extension authors/maintainers directly related
to what compilers and SDKs Microsoft makes available: generally, Python
is 'behind' the times of what's the latest version of VS and their SDK
that is available.
Also nonsense. Get it from right here:
http://www.microsoft.com/express/downloads/

Note the three tabs: VS2010, SQL Server R2, and VS2008.

Again, not nonsense.

That's available now. However, very real experience has made certain
people *very* reasonably cautious about when "now" becomes "the past" in
this situation: what is available now may disappear as soon as tomorrow,
with very little real notice.

Yeah, you can get a MSDN subscription and get access to a lot. Lots of
people can't afford that just to compile an extension they support.

--

Stephen Hansen
... Also: Ixokai
... Mail: me+list/python (AT) ixokai (DOT) io
... Blog: http://meh.ixokai.io/


 

Jonathan Hartley

Just a little reminder:

Microsoft has withdrawn VS2008 in favor of VS2010. The express version
is also unavailable for download. :-(

We can still get a VC++ 2008 compiler required to build extensions for
the official Python 2.6 and 2.7 binary installers here (Windows 7 SDK
for .NET 3.5 SP1):

http://www.microsoft.com/downloads/details.aspx?familyid=71DEB800-C59....

Download today, before it goes away!

Microsoft has now published a download for Windows 7 SDK for .NET 4.
It has the VC++ 2010 compiler. It can be a matter of days before the
VC++ 2008 compiler is totally unavailable.


I presume this problem would go away if future versions of Python
itself were compiled on Windows with something like MinGW gcc. Also,
this would solve the pain of Python developers attempting to
redistribute py2exe versions of their programs (i.e. they have to own
a Visual Studio license to legally be able to redistribute the
required C runtime). I don't understand enough to know why Visual
Studio was chosen instead of MinGW. Can anyone shed any light on that
decision?

Many thanks

Jonathan Hartley
 

sturlamolden

Also,
this would solve the pain of Python developers attempting to
redistribute py2exe versions of their programs (i.e. they have to own
a Visual Studio license to legally be able to redistribute the
required C runtime)

http://www.microsoft.com/downloads/...34-3e03-4391-8a4d-074b9f2bc1bf&displaylang=en

If this is not sufficient, ask Microsoft for permission or buy a copy
of Visual Studio (any will do, you can rebuild Python).

I don't understand enough to know why Visual
Studio was chosen instead of MinGW. Can anyone shed any light on that
decision?

It's the standard C and C++ compiler on Windows.
 

Martin v. Loewis

Python 3.1.1, file [pymem.h]:
PyAPI_FUNC(void *) PyMem_Malloc(size_t);

#define PyMem_MALLOC(n) (((n) < 0 || (n) > PY_SSIZE_T_MAX) ? NULL \
: malloc((n) ? (n) : 1))

The problem with the latter is that it seems intended for safety
but does the opposite...

Why do you say that? It certainly *does* achieve safety wrt. certain
errors, specifically:
- passing sizes that are out-of-range
- supporting malloc(0) on all systems

Perhaps (if it isn't intentional) this is a bug of the oversight type,
that nobody remembered to update the macro?

Update in what way?
Except for the problems with file descriptors I think a practical
interim solution for extensions implemented in C could be to just link
the runtime lib statically. For a minimal extension this increased the
size from 8 KiB to 49 KiB. And generally with MS tools the size is
acceptably small.

If you think that's fine for your extension module (which may well be
the case), go ahead. But then, you could also just link with a different
DLL version of the CRT instead.
I think that this would be safe because since the C API has to access
things in the interpreter I think it's a given that all the relevant
functions delegate to shared library (DLL) implementations, but I have
not checked the source code.

There are certainly more cases than the ones mentioned so far, in
particular the time zone and the locale. The CRT carries global
variables for these, so if you set them in the copy of the CRT that
Python links with, you won't see the change in your extension module -
which may or may not be a problem.
As a more longterm solution, perhaps python.org could make available the
redistributables for various MSVC versions, and then one could introduce
some scheme for indicating the runtime lib dependencies of any given
extension.

My preferred long-term solution is to reduce the usage of the C library
in CPython as much as reasonable, at least on Windows. Memory management
could directly use the heap functions (or even more directly
VirtualAlloc); filenos could be OS handles, and so on. There are
probably limitations to what you can achieve, but I think it's worth trying.

Regards,
Martin
 

sturlamolden

PyAPI_FUNC(void *) PyMem_Malloc(size_t);

#define PyMem_MALLOC(n)         (((n) < 0 || (n) > PY_SSIZE_T_MAX) ? NULL \
                                : malloc((n) ? (n) : 1))

I was afraid of that :(


Except for the problems with file descriptors I think a practical interim
solution for extensions implemented in C could be to just link the runtime lib
statically.

You still have two CRTs linked into the same process.
 

Martin v. Loewis

I presume this problem would go away if future versions of Python
itself were compiled on Windows with something like MinGW gcc. Also,
this would solve the pain of Python developers attempting to
redistribute py2exe versions of their programs (i.e. they have to own
a Visual Studio license to legally be able to redistribute the
required C runtime) I don't understand enough to know why Visual
Studio was chosen instead of MinGW. Can anyone shed any light on that
decision?

sturlamolden has already given the primary reason: Python,
traditionally, attempts to use and work with the system vendor's
compiler. On Windows, that's MSC. It's typically the one that best knows
about platform details that other compilers might be unaware of.

In addition, it's also the compiler and IDE that Windows developers (not
just Python core people, but also extension developers and embedders)
prefer to use, as it has quite good IDE support (in particular debugging
and code browsing).

Perhaps more importantly, none of the other compilers is really an
alternative. GCC in particular cannot build the Win32 extensions, since
it doesn't support the COM and ATL C++ features that they rely on (and
may not support other MSC extensions, either). So the Win32 extensions
must be built with VS, which means Python itself needs to use the same
compiler.

Likewise important: gcc/mingw is *not* a complete C compiler on Windows.
A complete C compiler would have to include a CRT (on Windows); mingw
doesn't (cygwin does, but I think you weren't proposing that Python be
built for cygwin - you can easily get cygwin Python anyway). Instead,
mingw relies on users having a CRT available to
them - and this will be a Microsoft one. So even if gcc was used, we
would have versioning issues with Microsoft CRTs, plus we would have to
rely on target systems including the right CRT, as we couldn't include
it in the distribution.

HTH,
Martin
 

sturlamolden

I was afraid of that :(

Also observe that this macro is very badly written (even illegal) C.
Consider what this would do:

PyMem_MALLOC(n++)

According to Linus Torvalds, using macros like this is not even legal
C:

http://www.linuxfocus.org/common/src/January2004_linus.html

This would be ok, and safe as long as we use the GIL:

register Py_ssize_t __pymem_malloc_tmp;
#define PyMem_MALLOC(n) \
    (__pymem_malloc_tmp = (n), \
     ((__pymem_malloc_tmp < 0 || __pymem_malloc_tmp > PY_SSIZE_T_MAX) ? NULL \
      : malloc(__pymem_malloc_tmp ? __pymem_malloc_tmp : 1)))


An inline function is a better solution, but not ANSI C standard:

inline void *PyMem_MALLOC(Py_ssize_t n)
{
    return ((n < 0 || n > PY_SSIZE_T_MAX) ? NULL
            : malloc(n ? n : 1));
}
 

Alf P. Steinbach /Usenet

* Martin v. Loewis, on 07.07.2010 21:10:
Python 3.1.1, file [pymem.h]:

PyAPI_FUNC(void *) PyMem_Malloc(size_t);

#define PyMem_MALLOC(n) (((n)< 0 || (n)> PY_SSIZE_T_MAX) ? NULL \
: malloc((n) ? (n) : 1))

The problem with the latter that it seems that it's intended for safety
but does the opposite...

Why do you say that? It certainly *does* achieve safety, wrt. to certain
errors, specifically:
- passing sizes that are out-of-range
- supporting malloc(0) on all systems

It uses malloc instead of PyMem_Malloc. malloc may well be different and use a
different heap in an extension DLL than in the Python interpreter and other
extensions. That's one thing that the docs (rightly) warn you about.

Update in what way?

I was guessing that at one time there was no PyMem_Malloc. And that it was
introduced to fix Windows-specific problems, but inadvertently without updating
the macro. It's just a guess as to reasons why the macro uses malloc directly.

If you think that's fine for your extension module (which may well be
the case), go ahead.

I have no comment on that except pointing it out as a somewhat stupid, somewhat
evil social inclusion/exclusion argument, talking to the audience. Argh. You're
wasting my time. But anyway, 49 KiB is small by today's standards. For example,
you get 20 of those in a single MiB, and about 20,000 in a single GiB.

But then, you could also just link with a different
DLL version of the CRT instead.

When I wrote "link the runtime lib statically" that was an alternative to the
usual link-as-DLL.

It wouldn't make sense to link the runtime lib statically as an alternative to
linking it statically.

As for linking to a different /version/ of the CRT, if you really mean that, I
think that's difficult. It's not necessarily impossible, after all there's
STLPort. But I think that it must at the very least be rather difficult to do
with Microsoft's tools, for otherwise people would have employed that solution
before, and so I wouldn't trust the result, and wouldn't waste the time trying.


Cheers,

- Alf
 

Martin v. Loewis

Also observe that this macro is very badly written (even illegal) C.
Consider what this would do:

PyMem_MALLOC(n++)

According to Linus Torvalds, using macros like this is not even legal
C:

http://www.linuxfocus.org/common/src/January2004_linus.html

[Please don't use "legal" wrt. programs - it's not "illegal" to violate
the language's rules; you don't go to jail when doing so. Linus said
"not allowed"]

You are misinterpreting that statement. Linus said that the isdigit
macro was non-conforming, *and meant that specifically for isdigit()*.
That's because the C standard says that isdigit is a function. Under
the as-if rule, you may implement it as a macro as long as nobody can
tell the difference. However, in the presented implementation, there
is a notable difference.

However, the C standard is silent wrt. PyMem_MALLOC, and it certainly
allows the definition of macros which use the macro arguments more than
once.
This would be ok, and safe as long as we use the GIL:

The macro is ok as it stands (minus the issues with multiple heaps).
The Python convention is that you clearly recognize PyMem_MALLOC as
a macro, so you should know not to pass parameters with side effects.
register Py_ssize_t __pymem_malloc_tmp;
#define PyMem_MALLOC(n) \
    (__pymem_malloc_tmp = (n), \
     ((__pymem_malloc_tmp < 0 || __pymem_malloc_tmp > PY_SSIZE_T_MAX) ? NULL \
      : malloc(__pymem_malloc_tmp ? __pymem_malloc_tmp : 1)))

That would partially defeat the purpose, namely it would require the
compiler to put the size into a variable in memory, and possibly prevent
optimizations from taking place that rely on constant propagation
(depending on how smart the compiler is).

Regards,
Martin
 

sturlamolden

That would partially defeat the purpose, namely it would require the
compiler to put the size into a variable in memory, and possibly prevent
optimizations from taking place that rely on constant propagation
(depending on how smart the compiler is).

Also after reading carefully what Linus said, it would still be
incorrect if n is a complex expression. So, an inline function is the
"correct" one here.
 

Martin v. Loewis

Perhaps (if it isn't intentional) this is a bug of the oversight type,
I was guessing that at one time there was no PyMem_Malloc. And that it
was introduced to fix Windows-specific problems, but inadvertently
without updating the macro. It's just a guess as to reasons why the
macro uses malloc directly.

It might indeed be that the function version was introduced specifically
for Windows. However, the macro was left intentionally: both for
backwards compatibility, and for use inside Python itself.
[...]

When I wrote "link the runtime lib statically" that was an alternative
to the usual link-as-DLL.

Ok, I lost the thread. When you said "a practical interim solution"
you were talking about what problem? I thought the discussion was
about the need to link with the same DLL version as Python.
It wouldn't make sense to link the runtime lib statically as an
alternative to linking it statically.

However, it would surely make sense to link with a different DLL than
the one that Python links with, assuming that would actually work.
As for linking to a different /version/ of the CRT, if you really mean
that, I think that's difficult. It's not necessarily impossible, after
all there's STLPort. But I think that it must at the very least be
rather difficult to do with Microsoft's tools, for otherwise people
would have employed that solution before, and so I wouldn't trust the
result, and wouldn't waste the time trying.

It's actually straightforward (or used to be, until they came up with
the SxS madness). It was actually the case that people did so
unexpectedly, and it seemed to work fine, except that it crashed when
passing FILE*. Then we started explaining that mixing CRTs is risky.

Regards,
Martin
 

Alf P. Steinbach /Usenet

* sturlamolden, on 07.07.2010 21:46:
CRT resources cannot be shared across CRT borders. That is the
problem. Multiple CRTs are not a problem if CRT resources are never
shared.

Yeah, but then we're down to file descriptors, C library locales and such as the
remaining problems.

Cheers,

- Alf
 
