OT: This Swift thing


Sturla Molden

Dear Apple,

Why should I be excited about an illegitimate child of Python, Go and
JavaScript?

Because it has curly brackets, no sane exception handling, and sucks less
than Objective-C?

Because init is spelled without double underscores?

Because it's faster than Python? Computers and smartphones are slow these
days. And I guess Swift makes my 3G connection faster.

It's ok to use in iOS apps. That would be it, I guess.


Sturla
 

Alain Ketterlin

Sturla Molden said:
Dear Apple,

Why should I be excited about an illegitimate child of Python, Go and
JavaScript?
[...]

Type safety. (And with it comes better performance ---read battery
life--- and better static analysis tools, etc.) LLVM (an Apple-managed
project) for the middle- and back-end, and a brand new front-end
incorporating a decent type system (including optional types for
instance).

Swift's memory management is similar to Python's (ref. counting). Which
makes me think that a subset of Python with the same type safety would
be an instant success.

-- Alain.
 

Chris Angelico

Swift's memory management is similar to Python's (ref. counting). Which
makes me think that a subset of Python with the same type safety would
be an instant success.

In the same way that function annotations to give type information
were an instant success?
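For context on Chris's jab: annotations, as they stood then, only record the type information; nothing in the interpreter enforces it (a minimal illustration):

```python
def add(x: int, y: int) -> int:
    return x + y

# The annotations are just metadata hung on the function object...
print(add.__annotations__)
# {'x': <class 'int'>, 'y': <class 'int'>, 'return': <class 'int'>}

# ...and CPython happily ignores them at call time:
print(add("spam", "eggs"))   # spameggs -- no error, no check
```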

ChrisA
 

wxjmfauth

On Thursday, June 5, 2014 at 10:14:15 UTC+2, Alain Ketterlin wrote:
Dear Apple,

Why should I be excited about an illegitimate child of Python, Go and
JavaScript?

[...]

Type safety. (And with it comes better performance ---read battery
life--- and better static analysis tools, etc.) LLVM (an Apple-managed
project) for the middle- and back-end, and a brand new front-end
incorporating a decent type system (including optional types for
instance).

Swift's memory management is similar to Python's (ref. counting). Which
makes me think that a subset of Python with the same type safety would
be an instant success.

About types: a very quick look at the doc, and at what
interests me.

"A string [type] is an ordered collection of characters [type], such..."

[]: my addendum

It seems they understand Unicode (a zeroth-order approximation).

jmf
 

Chris Angelico

If they were useful, they would be used more. I have made several uses
of (a variant of)

http://code.activestate.com/recipes/578528-type-checking-using-python-3x-annotations/

Precisely. I don't see that there's a huge body of coders out there
just itching to use "Python but with some type information", or we'd
be seeing huge amounts of code, well, written in Python with type
information. They've been seen as an interesting curiosity, perhaps,
but not as "hey look, finally Python's massive problem is solved". So
I don't think there's much call for a *new language* on the basis that
it's "Python plus type information".
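What "Python but with some type information" amounts to in practice can be sketched with a toy decorator that enforces annotations at call time (in the spirit of the ActiveState recipe linked above, but not the recipe's actual code):

```python
import functools
import inspect

def typechecked(func):
    """Reject calls whose arguments don't match the annotations.

    A hypothetical sketch, not the linked recipe's implementation.
    """
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = func.__annotations__.get(name)
            if expected is not None and not isinstance(value, expected):
                raise TypeError(f"{name} must be {expected.__name__}, "
                                f"got {type(value).__name__}")
        return func(*args, **kwargs)
    return wrapper

@typechecked
def double(n: int) -> int:
    return n * 2

double(21)        # fine
try:
    double("21")  # rejected at run time, not compile time
except TypeError as e:
    print(e)
```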

There's more call for "Python with C-like syntax", given the number of
times people complain about indentation. (There already is such a
language, but it's somewhat obscure, so it's quite likely Apple aren't
aware of its merits.) There might be call for "Python that can be
compiled efficiently to the such-and-such backend". But not "Python
with declared-type variables", not as a feature all of its own.

ChrisA
 

Sturla Molden

Type safety.

Perhaps. Python has strong type safety. It is easier to spoof a type in
C or C++ than in Python.

Python 3 also has type annotations that can be used to check that the
types are correct when we run tests. In a world of consenting adults I am
not sure we really need static typing as opposed to duck typing.

(And with it comes better performance ---read battery
life--- and better static analysis tools, etc.)

Perhaps, perhaps not. My experience is that only a small percentage of
the CPU time is spent in the Python interpreter.

- The GPU does not care if my OpenGL shaders are submitted from Python
or C. Nor do any other library or framework. If I use OpenCV to capture
live video, it could not care less if I use Python or C. A Cocoa app
using PyObjC will not use Python to prepare each pixel on the screen.
Even if the screen is frequently updated, the battery is spent somewhere
else than in the Python interpreter.

- A GUI program that is mostly idle spends more battery on lighting the
screen than executing code.

- If I use a 3G connection on my iPad, most of the battery will be spent
transmitting and receiving data on the mobile network.

- Where is the battery spent if I stream live video? In the Python
interpreter that executes a few LOC for each frame? I will make the bold
statement that an equivalent C program would exhaust the battery equally
fast.

- If a web app seems slow, it is hardly ever due to Python on the
server side.

- If the response time in a GUI is below the limits of human perception,
can the user tell my Python program is slower than a C program?


For the rare case where I actually have to run algorithmic code in
Python, there is always Numba (an LLVM-based JIT compiler) or Cython
which can be used to speed things up to C performance when the Python
prototype works. I rarely need to do this, though.
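The Numba workflow Sturla describes looks roughly like this (a sketch; the fallback shim below is my own addition so the code still runs where Numba isn't installed, and the speedup naturally assumes a real Numba install):

```python
try:
    from numba import njit          # LLVM-backed JIT, if available
except ImportError:                 # otherwise fall back to plain Python
    def njit(func):
        return func

@njit
def harmonic(n):
    # A tight numeric loop: exactly the kind of code Numba compiles
    # to machine code on first call, at roughly C speed.
    total = 0.0
    for k in range(1, n + 1):
        total += 1.0 / k
    return total

print(harmonic(10))   # roughly 2.929
```

The point is that the Python prototype and the "fast" version are the same source; the decorator is the only change.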


LLVM (an Apple-managed
project) for the middle- and back-end, and a brand new front-end
incorporating a decent type system (including optional types for
instance).

Numba uses LLVM.

When I compile Cython modules I use LLVM on this computer.

Swift's memory management is similar to Python's (ref. counting). Which
makes me think that a subset of Python with the same type safety would
be an instant success.

A Python with static typing would effectively be Cython :)

It is the tool of choice in many scientific Python projects today. Most
projects affiliated with NumPy and SciPy prefer Cython to C or Fortran
for new code.


Sturla
 

Michael Torrie

Perhaps, perhaps not. My experience is that only a small percentage of
the CPU time is spent in the Python interpreter.

Depends greatly on the type of application. While it's true that most
apps that aren't CPU bound are idle most of the time, there's more to
the story than that. A handy utility for analyzing power usage by
applications is Intel's powertop. It measures things like how many
wakeups a program caused, and which sleep states a CPU is spending time
in. It's more complicated and nuanced than simply adding up CPU time.

In any case I'm a bit surprised by people comparing Python to Swift at
all, implying that Python would have worked just as well and Apple
should have chosen it to replace Objective C. Why are we comparing an
interpreter with a compiled language? Apple's goal is to produce a
language that they can transition from Objective C to, and use to build
apps as well as core system frameworks. Swift provides a cleaner system
for developers to work in than Obj C did (which, by the way has
reference counting), but carries on the same object model that
developers are used to (and existing frameworks use).
 

Chris Angelico

My experience is that only a small percentage of the CPU time is spent in
the Python interpreter.
[various examples ]
- If the response time in a GUI is below the limits of human perception, can
the user tell my Python program is slower than a C program?

And I'd go further: with several of these examples (particularly this
last one), a contradictory example is a code smell on the level of the
Mythbusters' Corvette. I've had a few times when a GUI program written
in a high-level language was perceptibly slow; the most recent example
was complete proof of your assertion, because under certain
circumstances it could saturate a CPU core - but generally it would be
25% in my code and 75% in Xorg. The bug was that it was redrawing a
ridiculous amount of "stuff" that hadn't changed (and in a lot of
cases wasn't even visible), in response to a sweep of user actions.
(Imagine marking and highlighting text in your favourite editor, and
every time you move the mouse a pixel across, the entire buffer gets
redrawn.) So even when response time was appalling, most of the time
was actually spent inside API calls, not my code. Given how much
easier it is to debug Python code than C code, I'd say this puts the
advantage squarely on the high level language.

ChrisA
 

Sturla Molden

In any case I'm a bit surprised by people comparing Python to Swift at
all, implying that Python would have worked just as well and Apple
should have chosen it to replace Objective C.

Because if you look at the spec, Swift is essentially a statically typed
Python.

Swift and Python will also be used in the same niche. C will still be
used for low-level stuff. Swift is not a replacement for C. It's a
replacement for Objective-C.

Swift provides a cleaner system
for developers to work in than Obj C did (which, by the way has
reference counting), but carries on the same object model that
developers are used to (and existing frameworks use).

That is what PyObjC does as well.


Sturla
 

Michael Torrie

Because if you look at the spec, Swift is essentially a statically typed
Python.

I guess I'm not following your argument. Are you saying Swift should
have adopted Python syntax (similar to the .NET language Boo), or are
you saying Apple should have adopted Python instead?

Swift and Python will also be used in the same niche. C will still be
used for low-level stuff. Swift is not a replacement for C. It's a
replacement for Objective-C.

No, they won't be used in the same niche. Objective C is certainly not
used in the same niche as Python, so why would Swift be? I don't expect
to see any major OS X app written completely in Python, nor would I
expect any of the core frameworks to be written in Python. They will be
written in Swift, however.

That is what PyObjC does as well.

Not quite what I mean. As you said yourself, Swift is aiming to replace
ObjC. Thus core system frameworks will slowly be replaced over time with
frameworks written in Swift (binary, compiled frameworks). So you'll be
using PySwift in the future instead of PyObjC, which should be an easy
bridge to create since the object model is not changing.
 

Mark H Harris

No, they won't be used in the same niche. Objective C is certainly not
used in the same niche as Python, so why would Swift be? I don't expect
to see any major OS X app written completely in Python, nor would I
expect any of the core frameworks to be written in Python. They will be
written in Swift, however.

OS X apps will indeed be written in Swift, especially if they will be
distributed from the Apple Store --- Python apps are strictly verboten
in Apple land.

OTOH, much of my Python work is done on the mac for the mac... just
not distributed from the Apple Store.

OTOH, it only makes sense to code with Apple's tools if the app is
going to be Apple Store ready.

OTOH, I don't view the mac as an "Apple" thing. My mac is a *nix
clone (FreeBSD variant), which is a stable platform for Python coding and
debug|test. I won't be using Swift; however, I will be using IDLE.

JFTR, Apple should have adopted Python3, IMHO.


marcus
 

Alain Ketterlin

Sturla Molden said:
Perhaps. Python has strong type safety.

Come on.

[...]
Perhaps, perhaps not. My experience is that only a small percentage of
the CPU time is spent in the Python interpreter.
[...]

Basically, you're saying that a major fraction of Python programs is
written in another language. An interesting argument...
Numba uses LLVM.

As far as I know, Numba deals only with primitive types. You will gain
nothing for classes. (And Numba is a JIT.)
When I compile Cython modules I use LLVM on this computer.

Cython is not Python, it is another language, with an incompatible
syntax.
A Python with static typing would effectively be Cython :)

I don't think so. The various proposals mentioned elsewhere in this
thread give concrete examples of what static typing would look like in
Python.

-- Alain.
 

Chris Angelico

Basically, you're saying that a major fraction of Python programs is
written in another language. An interesting argument...

No, a major fraction of Python program execution time is deep inside
code written in another language. In extreme cases, you might have a
tiny bit of Python glue and the bulk of your code is actually, say,
FORTRAN - such as a hefty numpy number crunch - which lets you take
advantage of multiple cores, since there's no Python code running most
of the time.

And that's counting only CPU time. If you count wall time, your
typical Python program spends most of its time deep inside kernel API
calls, waiting for the user or I/O or something.

ChrisA
 

Alain Ketterlin

Chris Angelico said:
Precisely. I don't see that there's a huge body of coders out there
just itching to use "Python but with some type information", or we'd
be seeing huge amounts of code, well, written in Python with type
information. They've been seen as an interesting curiosity, perhaps,
but not as "hey look, finally Python's massive problem is solved". So
I don't think there's much call for a *new language* on the basis that
it's "Python plus type information".

I have seen dozens of projects where Python was dismissed because of the
lack of static typing, and the lack of static analysis tools. I'm
supervising our students during their internship periods in various
industrial sectors. Many of these students suggest Python as the
development language (they learned it and liked it), and the suggestion
is (almost) always rejected, in favor of Java or C# or C/C++.

-- Alain.
 

Alain Ketterlin

Chris Angelico said:
No, a major fraction of Python program execution time is deep inside
code written in another language. In extreme cases, you might have a
tiny bit of Python glue and the bulk of your code is actually, say,
FORTRAN - such as a hefty numpy number crunch - which lets you take
advantage of multiple cores, since there's no Python code running most
of the time.

This is actually what I meant. I find it sad to keep Python such a glue
language (the kind of language you throw away when the trend
changes---like Perl for example).
And that's counting only CPU time. If you count wall time, your
typical Python program spends most of its time deep inside kernel API
calls, waiting for the user or I/O or something.

But this is true of any IO-bound program, whatever the language. I see
no reason why Python should be restricted to simple processing tasks.

-- Alain.
 

Mark Lawrence

I have seen dozens of projects where Python was dismissed because of the
lack of static typing, and the lack of static analysis tools. I'm
supervising our students during their internship periods in various
industrial sectors. Many of these students suggest Python as the
development language (they learned it and liked it), and the suggestion
is (almost) always rejected, in favor of Java or C# or C/C++.

-- Alain.

How many tears are shed as a result of these decisions? Or do they
spend all afternoon at the pub celebrating as the code has compiled,
while the poor, very hard done by Python programmers have to stay behind
and test their code? Let's face it, we all know that for a statically
compiled language the compiler catches all errors, so there's nothing to
worry about.
 

Chris Angelico

I don't understand that comment, please explain.

"Type safety" means many different things to different people. What
Python has is untyped variables, and hierarchically typed objects.
It's impossible to accidentally treat an integer as a float, and have
junk data [1]. It's impossible to accidentally call a base class's
method when you ought to have called the overriding method in the
subclass, which is a risk in C++ [2]. If you mistakenly pass a list to
a function that was expecting an integer, that function will *know*
that it got a list, because objects in Python are rigidly typed.

But some languages stipulate the types that a variable can take, and
that's something Python doesn't do. If you want to say that this
function argument must be an integer, you have to explicitly check it
inside the function. (And the Pythonic thing to do is to *not* check
it, but that's a separate point.) This is something that function
annotations can be used for, but I'm not seeing a huge thrust to make
use of them everywhere. Why not? I suspect because the need for it
just isn't as great as some people think.

ChrisA

[1] Note that in some circumstances, you can (deliberately) fiddle
with an object's type. But you can't just reinterpret the bits in
memory, the way you can in C, by casting a pointer and dereferencing
it. Hence, it's impossible to *accidentally* muck this up.
[2] Again, you can muck things up, by explicitly pulling up a function
from the base class, rather than using method lookup on the object.
But again, you can't do it accidentally.
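Chris's distinction between untyped names and rigidly typed objects can be sketched in a few lines:

```python
x = 42            # the *name* x is untyped...
x = "forty-two"   # ...so it can be rebound to any object

# ...but every *object* knows exactly what it is, so mixing types
# fails loudly instead of reinterpreting bits:
try:
    1 + [2, 3]
except TypeError as e:
    print(e)      # unsupported operand; no junk data, no silent cast

# And a function handed the wrong type can always tell, if it chooses
# to check (the explicit check Chris mentions):
def f(n):
    if not isinstance(n, int):
        raise TypeError(f"expected int, got {type(n).__name__}")
    return n + 1

print(f(41))      # 42
```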
 
