Re: what are the most popular building and packaging tools for python ??

Discussion in 'Python' started by Neil Benn, Oct 25, 2004.

  1. Neil Benn

    Neil Benn Guest

    Steve wrote:

    ><snip>
    >
    >(the suits are yet to be enlightened about
    >the benefits of OSS).
    >

    <snip>
    Hello,

    A word of warning: if you get a knowledgeable suit (hmm, I
    wear street clothes at work, does that make me a street!!) then they may
    start looking at reverse compilation. This is something it is possible
    to do in most 'bytecode' languages - other bytecode implementations
    (Java, .NET) have 'obfuscators' that will make your code unreadable if
    someone tries to decompile it. So far, though, I've not seen a python
    obfuscation tool - a real shame, because it means I have to
    release my source code if I'm writing a commercial app. To my mind the
    'make your client sign an agreement' policy doesn't always work,
    especially if I'm selling low-value (sub $10K) tools where I don't want
    to go through the bother of waving a legal agreement under somebody's
    nose when I'm selling software (any more than a typical EULA) and I
    certainly wouldn't have the clout to enforce it! That goes for the EULA
    as well, so you basically need a good licence manager.
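    A quick illustration of how little a shipped .pyc hides (a minimal
    sketch using only the standard library's dis module; the serial
    string is made up):

    import dis

    def check_licence(serial):
        """Toy licence check - the 'secret' below survives compilation."""
        return serial == "CENIX-1234"

    dis.dis(check_licence)
    # The disassembly includes LOAD_CONST ('CENIX-1234') and the ==
    # comparison, so even without a full decompiler the check is one
    # 'dis' call away from being read and patched out.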

    As to why I wouldn't open source my code - well, what I'm doing
    hasn't been done before in my industry. If I release source code then
    someone can lift the ideas out of my code, or simply bypass the licence
    manager to install multiple clients for the cost of one. This means
    that I either have to write my licence manager in C or bridge out to
    another language. Before you ask - I buy my music!!

    If anyone has heard of an obfuscator for python bytecode - let me
    know please.

    Cheers,

    Neil
    Neil Benn, Oct 25, 2004
    #1

  2. Re: what are the most popular building and packaging tools for python ??

    Neil Benn <> wrote:
    ...
    > start looking at reverse compilation. This is something it is possible
    > to do in most 'bytecode' languages - other bytecode implementations
    > (java, .NET) use 'obfuscators' that will make your code unreadable if
    > someone tries to decompile it. To this end, I've not seen a python
    > obfuscation tool


    Security by obscurity isn't. If you can obfuscate, I can deobfuscate,
    if it's worth my while. If you stick in license checking, I can patch
    it out. It's not about one programmer being better than another: the
    attacker/cracker has the inevitable advantage. If you ship all the code
    (even in object obfuscated form) you're toast. I know: I've done that
    as part of my job for ten years of my life -- copy protection and the
    like WAS one part of my job as senior software consultant. Thousands of
    hours wasted out of my life. Quoth the raven, nevermore.

    If your code contains really valuable trade secrets, my well-considered,
    experience-driven, professional suggestion is to carefully remove just
    enough of the secret parts from the code you distribute, and make them
    available only as web-services or the equivalent from a host you
    control. Whatever implementation language you use, the only code that
    will never be cracked is code that does NOT leave your control. (well,
    that AND most code that's not really all that valuable, of course;-).

    A web service can require any form of authentication and validation from
    its 'client' code, so you can implement any business model you like. I
    heartily recommend (depending on the situation) subscription-based or
    per-use fees, over the old and crufty 'sell the bits' model that never
    really worked right (IMHO). Be sure to pick _important_ parts of your
    code as those that are only available as webservices, not to make the
    webservices just a kind of license check, or else the webservice access
    WILL be hacked out just like any other license check (assuming your code
    IS valuable, of course).
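    To make the shape concrete, a minimal sketch of such a thin client
    (the endpoint URL, parameter names and bearer-token scheme here are
    hypothetical, not any particular service's API):

    import json
    import urllib.request

    API_KEY = "customer-specific-key"                 # issued per subscription
    ENDPOINT = "https://api.example.com/v1/analyse"   # hypothetical host you control

    def analyse(payload):
        # The valuable computation happens server-side; the shipped
        # client only authenticates and forwards data.
        req = urllib.request.Request(
            ENDPOINT,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json",
                     "Authorization": "Bearer " + API_KEY},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    Cracking such a client gains nothing, since the algorithm never
    leaves the server, and the server can meter per-use or subscription
    billing on every request.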

    You can distribute, depending on your exact circumstances, an "already
    somewhat useful" pure-client program, as long as the full functionality
    that customers will pay for is only accessible via the webservices. You
    can even opensource the part you distribute, that may garner you useful
    feedback, more customers, etc.

    Of course, there _are_ still, today, applications which can't assume the
    net is available and must still offer full functionality no matter what.
    They're fewer and fewer, thanks be, as connectivity spreads -- games
    accrue multiplayer online-play features that players are eager for,
    financial programs require access to updated exchange rates or stock
    levels, and so on. If you do need to sell applications which have full
    functionality without net access, you may as well resign yourself: you
    will never be truly safe, alas.


    Alex
    Alex Martelli, Oct 25, 2004
    #2

  3. Re: what are the most popular building and packaging tools for python ??

    On Mon, 25 Oct 2004 23:13:35 +0200, (Alex Martelli) wrote:
    >Neil Benn <> wrote:
    > ...
    >> start looking at reverse compilation. This is something it is possible
    >> to do in most 'bytecode' languages - other bytecode implementations
    >> (java, .NET) use 'obfuscators' that will make your code unreadable if
    >> someone tries to decompile it. To this end, I've not seen a python
    >> obfuscation tool

    >
    >Security by obscurity isn't. If you can obfuscate, I can deobfuscate,
    >if it's worth my while. If you stick in license checking, I can patch
    >it out. It's not about one programmer being better than another: the
    >attacker/cracker has the inevitable advantage. If you ship all the code
    >(even in object obfuscated form) you're toast. I know: I've done that
    >as part of my job for ten years of my life -- copy protection and the
    >like WAS one part of my job as senior software consultant. Thousands of
    >hours wasted out of my life. Quoth the raven, nevermore.
    >
    >If your code contains really valuable trade secrets, my well-considered,
    >experience-driven, professional suggestion is to carefully remove just
    >enough of the secret parts from the code you distribute, and make them
    >available only as web-services or the equivalent from a host you
    >control. Whatever implementation language you use, the only code that
    >will never be cracked is code that does NOT leave your control. (well,
    >that AND most code that's not really all that valuable, of course;-).
    >

    It's an interesting problem. Personally, I like open source, but I think
    secret stuff should be possible, and I think it will be one of these days...

    To anthropomorphize, code is a recipe for CPU behavior, analogous to the
    way a cooking recipe is a recipe for cook behavior. If I send you a pgp-encrypted
    (with your public key) recipe for a special dish, and you decrypt it in a secure
    kitchen and do the cooking, and only serve results from the kitchen through a special
    window for the waiters to take it to the end user, it would seem that the recipe
    was secure, except for reverse engineering the product itself, and/or inferring what
    may be inferred from externally observable parameters, such as overall timing, etc.

    By analogy, I don't think it's a stretch to imagine a CPU core with a "secure kitchen"
    in the form of an inaccessible instruction cache memory where decrypted (by the core
    hardware) code/recipes may be stored for execution, and with private temporary data cooking
    areas for use before results come out the window.

    If a multi-core chip had a core dedicated as a "secure kitchen," I think it could appear
    software-wise via a proxy DLL that had the hardware-level protocol for passing in encrypted
    code and then using it like a closed pure function, passing arguments and receiving results.

    IOW, a python extension could nicely encapsulate access to securely encrypted code,
    which it would get into the "secure kitchen" by using hardware core-core communication
    channels designed to support all this.

    Using this kind of system, a customer would give you his CPU's public key and serial number,
    and you would send him the encrypted code as part of your app package. No other CPU would
    be able to run this code, since no other CPU would have the matching private key
    to decrypt it. Yes, someone could send a bogus non-CPU public key that he has the private
    key for, so there would have to be an authentication of CPU public keys in some kind of
    registry, presumably generated by the CPU manufacturer, mapping chip serial number to its
    *public* key, available for checking at some authenticable site.
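    The vendor side of that scheme might look roughly like this sketch.
    It assumes the hypothetical CPU-held keypair described above, and
    uses the third-party `cryptography` package for a hybrid RSA+AES
    step, since RSA alone cannot encrypt a large code blob:

    import os
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_for_cpu(cpu_public_key_pem, code_blob):
        # Encrypt the code with a fresh AES key, then wrap that key so
        # that only the target CPU's private key can unwrap it.
        public_key = serialization.load_pem_public_key(cpu_public_key_pem)
        session_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(session_key).encrypt(nonce, code_blob, None)
        wrapped_key = public_key.encrypt(
            session_key,
            padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )
        return wrapped_key, nonce, ciphertext

    Whether the key really lives in tamper-proof hardware is exactly the
    trust problem the registry - and the replies below - turn on.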

    Maybe I am missing something, but this seems feasible to me.
    One might feel distrustful of supplying a "kitchen" in one's house
    for someone else to cook secretly in (poison? etc?), but having it encapsulated
    in a core only accessed via argument passing and result retrieval through
    actual hardware channels makes the secret code something like a pure function.
    There can't be any side effects outside of the kitchen, except insofar as the
    open side may be misled by data results.

    I just hope this obvious overall concept has not been locked into
    some patents for use in restricting freedom to create and share code
    according to whatever arrangements people want to agree on to do that.

    Instead, I would hope it would enhance the possibilities for making dependable
    agreements, which should be good for everyone, so long as access to
    the secure kitchen core hardware functionality is open, and optional,
    and does not have back doors.

    (BTW, cores that specialize in audio or video streams are kind of obvious too).

    >A web service can require any form of authentication and validation from
    >its 'client' code, so you can implement any business model you like. I
    >heartily recommend (depending on the situation) subscription-based or
    >per-use fees, over the old and crufty 'sell the bits' model that never
    >really worked right (IMHO). Be sure to pick _important_ parts of your
    >code as those that are only available as webservices, not to make the
    >webservices just a kind of license check, or else the webservice access
    >WILL be hacked out just like any other license check (assuming your code
    >IS valuable, of course).

    It seems this is where we are, but give it another few years, and other
    possibilities should become available. Or I'm dreaming ;-)

    >
    >You can distribute, depending on your exact circumstances, an "already
    >somewhat useful" pure-client program, as long as the full functionality
    >that customers will pay for is only accessible via the webservices. You
    >can even opensource the part you distribute, that may garner you useful
    >feedback, more customers, etc.
    >
    >Of course, there _are_ still, today, applications which can't assume the
    >net is available and must still offer full functionality no matter what.
    >They're fewer and fewer, thanks be, as connectivity spreads -- games
    >accrue multiplayer online-play features that players are eager for,
    >financial programs require access to updated exchange rates or stock
    >levels, and so on. If you do need to sell applications which have full
    >functionality without net access, you may as well resign yourself: you
    >will never be truly safe, alas.
    >

    Yes, but if it takes sophisticated chip-destroying methods to retrieve a key,
    that wouldn't happen unless it was a blockbuster application with a market worth
    *a lot*. And the pirate would either have to distribute unencrypted copies or set up
    his own competing service to encrypt his clear copy for specific CPUs, which
    would seem pretty vulnerable legally. I.e., anyone trying to make money with
    the pirated stuff would be an easy target for the lawyers, IWT. And how many
    would destroy chips and have the equipment to extract a hidden key, and then
    give away the decrypted result?

    Regards,
    Bengt Richter
    Bengt Richter, Oct 26, 2004
    #3
  4. kosh

    kosh Guest

    On Monday 25 October 2004 5:44 pm, Bengt Richter wrote:

    > It's an interesting problem. Personally, I like open source, but I think
    > secret stuff should be possible, and I think it will be one of these
    > days...
    >


    I have to admit I hope it is not ever really possible.


    > By analogy, I don't think it's a stretch to imagine a CPU core with a
    > "secure kitchen" in the form of an inaccessible instruction cache memory
    > where decrypted (by the core hardware) code/recipes may be stored for
    > execution, and with private temporary data cooking areas for use before
    > results come out the window.


    I certainly hope that does not happen, and I will continue to advise customers
    not to touch things that even try to do stuff like that. I have seen too many
    people burned badly when some proprietary app they had, which required some
    special key server, just stopped working. The company would go out of
    business, or decide that they did not want people to use the old version any
    more, etc. Either way, the amount of lock-in with a system like that is
    staggering. Even if the software did not cost any money, the price is far too
    high for what you lose. However, such software tends to be very expensive, which
    makes it an even worse investment. The worst ones are those you can't really
    get the data back out of.

    > IOW, a python extension could nicely encapsulate access to securely
    > encrypted code, which it would get into the "secure kitchen" by using
    > hardware core-core communication channels designed to support all this.
    >


    I would hope something like this never makes it into python, just so that
    people could not use it. I would want the most insane hoops to exist for
    people to even try this, and I would want the OS to give huge warnings
    whenever any piece of software used that feature.

    > Using this kind of system, a customer would give you his CPU's public key
    > and serial number, and you would send him the encrypted code as part of
    > your app package. No other CPU would be able to run this code, since no
    > other CPU would have the matching private key to decrypt it. Yes, someone
    > could send a bogus non-CPU public key that he has the private key for, so
    > there would have to be an authentication of CPU public keys in some kind of
    > registry, presumably generated by the CPU manufacturer, mapping chip serial
    > number to its *public* key, available for checking at some authenticable
    > site.


    So what you want to do is run arbitrary code on someone else's machine that
    they have no way to access, and if for some reason you go out of
    business they are screwed. Yeah, a system like that is great - except when you
    get a new computer and try to install the software on it and find it doesn't
    work, since the key no longer matches and the company that made the software
    no longer exists. This is just a horrible idea that should not go into any
    systems.

    >
    > Maybe I am missing something, but this seems feasible to me.
    > One might feel distrustful of supplying a "kitchen" in one's house
    > for someone else to cook secretly in (poison? etc?), but having it
    > encapsulated in a core only accessed via argument passing and result
    > retrieval through actual hardware channels makes the secret code something
    > like a pure function. There can't be any side effects outside of the
    > kitchen, except insofar as the open side may be misled by data results.
    >


    There is no way I would trust a system like that, and long term any company
    that does will end up paying a very heavy price. Those that use those methods
    will end up going out of business anyway, since a competitor even at twice the
    price will often be able to show themselves to be a far more attractive
    vendor. Being able to tell your customers that if you vanish, the software as
    is will work until they switch to something else, that they can get their
    data out, that they can run it on a newer computer, etc., counts for a lot.
    All this lock-in, DRM etc. stuff is just going to burn a lot of people, and it
    is a bad idea. It is trying to create an equivalent to physical controls in a
    world which is not physical, and it just won't work.



    > Instead, I would hope it would enhance the possibilities for making
    > dependable agreements, which should be good for everyone, so long as access
    > to the secure kitchen core hardware functionality is open, and
    > optional, and does not have back doors.
    >


    These agreements belong in the legal system not in the technology. That is why
    you have contracts and you have penalties for breaching a contract.
    kosh, Oct 26, 2004
    #4
  5. Neil Benn

    Neil Benn Guest

    Re: what are the most popular building and packaging tools for python ??

    Alex Martelli wrote:

    > <snip>
    >
    >Of course, there _are_ still, today, applications which can't assume the
    >net is available and must still offer full functionality no matter what.
    >They're fewer and fewer, thanks be, as connectivity spreads -- games
    >accrue multiplayer online-play features that players are eager for,
    >financial programs require access to updated exchange rates or stock
    >levels, and so on. If you do need to sell applications which have full
    >functionality without net access, you may as well resign yourself: you
    >will never be truly safe, alas.
    >
    >
    >

    <snip>
    Hello,

    Yes, of course you're completely correct - you can't make
    anything totally secure. I'm talking about a handheld app, so I can't
    always assume connectivity. However, I can make it harder than it
    otherwise would be to make illegal copies of the kit; consider this example:

    ==

    No Obfuscation, no Licence Manager

    Copy it - simply copy it onto a new handheld
    Source code - decompile it

    No Obfuscation, with Licence Manager

    Copy it - break the licence manager, helped by getting the source
    code off, reading it, and using that to discover what you're
    looking for
    Source code - decompile it

    With Obfuscation and Licence Manager

    Copy it - discover how the licence manager works, which will normally
    have three levels of protection - secrecy, mathematical mangling and an
    identification code
    Source code - copy off the byte code and decompile it; you get a function
    that uses 'a' for a class, 'aa' for another method, 'aaaaaaaaaaa' for
    another variable, VoteGeorgeBushOff for an interface (did you like the
    subliminal message!!). Yes, you can deobfuscate that by working out what
    went on, but it's not gonna be easy and you may as well not bother.

    ==
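    A toy sketch of that renaming pass, for concreteness - standard
    library only, Python 3.9+ for ast.unparse. It naively renames free
    names (imports, builtins, called helpers) too, which a real
    obfuscator must leave alone; purely illustrative:

    import ast

    class Renamer(ast.NodeTransformer):
        def __init__(self):
            self.names = {}

        def _opaque(self, name):
            # First name seen -> 'a', second -> 'aa', third -> 'aaa', ...
            return self.names.setdefault(name, "a" * (len(self.names) + 1))

        def visit_FunctionDef(self, node):
            node.name = self._opaque(node.name)
            self.generic_visit(node)
            return node

        def visit_arg(self, node):
            node.arg = self._opaque(node.arg)
            return node

        def visit_Name(self, node):
            node.id = self._opaque(node.id)
            return node

    source = """
def check_serial(serial_number):
    expected = compute_expected(serial_number)
    return expected == serial_number
"""
    print(ast.unparse(Renamer().visit(ast.parse(source))))
    # -> def a(aa):
    #        aaa = aaaa(aa)
    #        return aaa == aa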

    So while I certainly agree that it's impossible to make code safe -
    the same way it's impossible to stop a determined person kicking my front
    door down - it is better to put obstacles in the way to raise the
    pain barrier. Cracking XP/Doom3/The Sims is done all the
    time - but this isn't a 10K unit sale (if it becomes that I'll host a python
    party in the Caribbean!!), more like a <100 unit sale.

    In general, you could also get the handheld, if connected to
    T'Internet (British people may get that!!), to dial home with its serial
    number every so often, just so I can check that it's still licensed
    correctly.
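    Such a dial-home check might look roughly like this sketch. The
    server URL and its plain "ok" reply are made up, and a real check
    would want a signed response so a cracker can't simply spoof the
    server:

    import urllib.parse
    import urllib.request

    LICENCE_SERVER = "https://licence.example.com/check"   # hypothetical

    def still_licensed(serial_number):
        query = urllib.parse.urlencode({"serial": serial_number})
        try:
            with urllib.request.urlopen(LICENCE_SERVER + "?" + query,
                                        timeout=10) as reply:
                return reply.read().strip() == b"ok"
        except OSError:
            # The handheld may simply be offline - fail open, since this
            # use case demands full functionality without connectivity.
            return True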

    Once again, I know this isn't foolproof, but to my mind I'm comparing
    it against house security - not putting a lock on the front door is
    madness; then you need to work out how dangerous your area is and put
    appropriate security around it, but it's never gonna be unbreakable!!!

    Cheers,

    Neil

    --

    Neil Benn
    Senior Automation Engineer
    Cenix BioScience
    BioInnovations Zentrum
    Tatzberg 47
    D-01307
    Dresden
    Germany

    Tel : +49 (0)351 4173 154
    e-mail :
    Cenix Website : http://www.cenix-bioscience.com
    Neil Benn, Oct 26, 2004
    #5
  6. Andrew Dalke

    Andrew Dalke Guest

    Re: what are the most popular building and packaging tools for python ??

    Bengt Richter wrote:
    > Using this kind of system, a customer would give you his CPU's public key and serial number,
    > and you would send him the encrypted code as part of your app package. No other CPU would
    > be able to run this code, since no other CPU would have the matching private key
    > to decrypt it. Yes, someone could send a bogus non-CPU public key that he has the private
    > key for, so there would have to be an authentication of CPU public keys in some kind of
    > registry, presumably generated by the CPU manufacturer, mapping chip serial number to its
    > *public* key, available for checking at some authenticable site.
    >
    > Maybe I am missing something, but this seems feasible to me.


    When the software starts, how does it know it's getting data from the
    crypto core? It could be running on an emulated processor that supports
    the new verification system. I use it to generate the public key and
    get the code from you. I then save the state of the emulated
    machine and send a copy of the image and your software's key to all
    my buddies. Poof, they can use it, albeit slowly.

    Your solution to that is a global registry. If you only check it
    once, when the license key is generated, then all it takes for the
    emulator to work is for someone, somewhere in the world to reverse
    engineer the key for a given CPU. And experience has shown that
    it's very hard to design tamper-proof crypto hardware. There are
    attacks like looking at the power draw on the circuitry to estimate
    where in the process the verification failed. See comp.risks for
    many more.

    Even if it isn't emulated, how does the software connect to the
    core? Can I put a shim between the software and the OS that
    intercepts the crypto calls and replaces them with my own calls?

    (Some examples of this in real life include malloc debuggers,
    OpenGL debuggers, and system call tracers.)

    If so, I can register the software with the public key generated
    from my shim, which acts like a real machine. Or I can record the
    messages and attempt a replay attack. If your software generates
    a nonce to get around the attack, well, how does it generate the
    nonce? Can I make it so it always uses the same value?
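    In Python terms such a shim is almost trivial. A sketch, where
    `cpu_crypto` is a hypothetical module standing in for the hardware
    interface, not a real library:

    import sys
    import types

    ATTACKER_KEY = b"public key the attacker holds the private half of"

    # Build a fake module and install it before the protected app runs;
    # any later `import cpu_crypto` inside that app now gets the shim.
    fake = types.ModuleType("cpu_crypto")
    fake.get_public_key = lambda: ATTACKER_KEY
    fake.generate_nonce = lambda: b"\x00" * 16   # fixed nonce -> replayable
    sys.modules["cpu_crypto"] = fake

    The app then registers itself against a key whose private half the
    attacker holds, and every "fresh" nonce repeats.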

    If I can't insert a shim, I can still write software to scan
    your program as a text file and look for where it makes the
    system calls, then rewrite the binary to make calls to something
    I can intercept.

    It would take some work, yes, but it only needs to be broken
    once. If it's embedded in common hardware that just means more
    people will attempt to hack it even for fun. For example, look
    at the XBox Linux to see how people have figured out how to
    break that protection system. There's even a book on the topic:
    http://www.hackingthexbox.com/

    Hardware supported copy protection has been tried many times
    over the last few decades. The approaches I outlined above
    are some of the standard ways to break them and unless I
    misunderstood your suggestion, they can easily be applied
    here as well.

    Andrew
    Andrew Dalke, Oct 26, 2004
    #6
  7. Re: what are the most popular building and packaging tools for python ??

    kosh <> wrote:

    > On Monday 25 October 2004 5:44 pm, Bengt Richter wrote:
    >
    > > It's an interesting problem. Personally, I like open source, but I think
    > > secret stuff should be possible, and I think it will be one of these

    >
    > I have to admit I hope it is not ever really possible.


    But it _IS_, just keep it on your own host. Bengt's suggestion is
    technically interesting, but with networking becoming more and more
    pervasive, "real secrecy" - now "really possible" for applications that
    cover maybe 80% of the software market (though with some limitations,
    e.g. if you're selling an app that can't fully run on a laptop on a flying
    plane) - will become possible for a wider and wider range and with fewer
    and fewer limitations (e.g., wifi on planes is gonna be a standard
    feature soon, and as it costs so little for airlines to provide it I
    predict it will spread pretty fast).


    > I certainly hope that does not happen and will continue to advise customers
    > not to touch things that even try to do stuff like that. I have seen too many
    > people burned badly when some proprietary app they had which required some
    > special key server just stopped working. The company would go out of
    > business, decide that they did not want people to use the old version any
    > more, etc. Either way, the amount of lock-in with a system like that is


    This is certainly a very valid business consideration. Without (at
    least) source in escrow, or the like, betting your business on ANY
    closed-source app may be very unwise.

    Nevertheless, people and businesses use (e.g.) google daily, and depend
    on it -- they don't have google's sources, and if they did they'd be
    little use since the real value in this case is in the huge database
    they're accessing. Why should any other webservice be different?


    > staggering. Even if the software did not cost any money the price is far too
    > high for what you lose. However the software tends to be very expensive which
    > makes it an even worse investment. The worst ones are those you can't really
    > get the data back out of.


    And _that_ is a customer protection law I'd love to see: requiring all
    applications using proprietary formats for user data to provide an
    export functionality (possibly as an auxiliary program) that is able to
    write out the data as XML according to a DTD (or Schema, or RelaxNG,
    whatever) to be published and filed in some accessible archive, and
    import such an XML file back into the app's own format.
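    The required export function could be as plain as this sketch
    (standard library only; the record fields are illustrative):

    import xml.etree.ElementTree as ET

    def export_records(records, path):
        # Dump the app's records as simple XML, so the data outlives
        # the application that created it.
        root = ET.Element("records")
        for record in records:
            item = ET.SubElement(root, "record")
            for field, value in record.items():
                ET.SubElement(item, field).text = str(value)
        ET.ElementTree(root).write(path, encoding="utf-8",
                                   xml_declaration=True)

    export_records([{"name": "sample", "value": 42}], "export.xml")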

    Some tolerance for closed source is one thing, hijacking users' data is
    the bit that REALLY makes me see red!-)


    > > Instead, I would hope it would enhance the possibilities for making
    > > dependable agreements, which should be good for everyone, so long as access
    > > to the secure kitchen core hardware functionality is open, and
    > > optional, and does not have back doors.

    >
    > These agreements belong in the legal system not in the technology. That is why
    > you have contracts and you have penalties for breaching a contract.


    It is not reasonable to insist that, since laws exist to protect you,
    you cannot take further precautions to guard against scofflaws, where
    technically reasonable. It's normal, for example, to place a lock on
    some sort of property, even though the legal system may also forbid
    others from removing said property of yours.

    The cable/networking supplier who serves this apartment building (a few
    tenants have bought various kinds of cable access from them -- TV, video
    on demand, telephony, 10 mbps internet, etc -- it's a fiber-optic cable,
    so I'm sure they have plenty of spare capacity;-) has (with my
    permission as the building's owner) installed a service cabinet in the
    cellar, with such interesting toys in it as Cisco routers, fiber optic
    modems/multiplexers, etc. As part of my agreement with them, I have
    agreed that their cabinet can be locked (and it generally is, except
    when one of their technicians forgets to re-lock it, which is why I
    know what's in it). If tomorrow they go out of business (unlikely but
    no more so than for an average software supplier), their internet
    connectivity is no doubt going to go down for quite a while until the
    bankruptcy courts sort things out; anybody relying on that connection
    for their business is going to be scrambling pretty bad to find
    alternate suppliers, _and_ is going to hurt if they have left important
    data on their IMAP server, etc, etc.

    Yet one may evaluate this as a reasonable risk to take (some do). ((Me,
    I'm still using ADSL, waiting for more maturity in those 10 mbps lines
    before I trust them; in any case I always have an ISDN modem ready to
    take over, so my connectivity _should_ degrade smoothly rather than ever
    get cut off abruptly... in theory...)). It does not appear to me that
    trusting a specific web service for your business presents any
    substantial difference from trusting a connectivity supplier in general,
    and the 'locking' aspects of the service (as long as no hijacking of
    user data is involved) parallel the locking of the connectivity
    supplier's service cabinet -- an extra precaution in addition to laws and
    contracts, just in case.

    And Bengt's idea (apart from requiring a highly specialized CPU in lieu
    of pretty generic network access) doesn't appear to be any more
    objectionable than web services, either.


    Alex
    Alex Martelli, Oct 26, 2004
    #7
  8. kosh

    kosh Guest

    Re: what are the most popular building and packaging tools for python ??

    On Tuesday 26 October 2004 3:07 am, Alex Martelli wrote:

    >
    > This is certainly a very valid business consideration. Without (at
    > least) source in escrow, or the like, betting your business on ANY
    > closed-source app may be very unwise.
    >
    > Nevertheless, people and businesses use (e.g.) google daily, and depend
    > on it -- they don't have google's sources, and if they did they'd be
    > little use since the real value in this case is in the huge database
    > they're accessing. Why should any other webservice be different?
    >


    One of the things that makes those kinds of webservices different is that you
    can easily leave. Yes, google is currently the best, but there are lots of
    other search engines that are good enough and would get better pretty rapidly
    if google went nuts, vanished, etc. That is part of the degree of protection:
    since the person is not really locked in, google has to behave or lose them.


    > > staggering. Even if the software did not cost any money the price is far
    > > too high for what you lose. However the software tends to be very
    > > expensive which makes it an even worse investment. The worst ones are
    > > those you can't really get the data back out of.

    >
    > And _that_ is a customer protection law I'd love to see: requiring all
    > applications using proprietary formats for user data to provide an
    > export functionality (possibly as an auxiliary program) that is able to
    > write out the data as XML according to a DTD (or Schema, or RelaxNG,
    > whatever) to be published and filed in some accessible archive, and
    > import such an XML file back into the app's own format.
    >
    > Some tolerance for closed source is one thing, hijacking users' data is
    > the bit that REALLY makes me see red!-)
    >


    There is no doubt this should exist; I have just seen a number of cases where
    it does not, and companies get screwed pretty badly.

    >
    > And Bengt's idea (apart from requiring a highly specialized CPU in lieu
    > of pretty generic network access) doesn't appear to be any more
    > objectionable than web services, either.
    >
    >


    The webservice one is much less of a burden on the system. You can upgrade
    your computer, use it from a different computer, use it through a proxy, etc.
    You can't do any of that with an application running on your machine tied to
    some ID on your CPU. Many web services even work with a wide range of browsers,
    so you are not tied into any given OS, browser, hardware platform, etc.

    Yes, your data can still be locked in a proprietary format kept only on the
    server, so you never see it except in reports on that data - and I suspect
    that does happen - but many services make it very easy to get all of your data
    out of the system.

    So while the webservices can do some of the negative things the crypto cpu
    would, it also automatically makes many things far better. If my computer
    takes a lightning strike, I can use another computer to use the webservice
    right away. With the crypto app you are pretty much screwed until you go
    through the whole authorization song and dance.
    kosh, Oct 26, 2004
    #8
  9. Re: what are the most popular building and packaging tools for python ??

    kosh <> wrote:
    ...
    > One of the things that makes those kinds of webservices different is that you
    > can easily leave. Yes, google is currently the best, but there are lots of
    > other search engines that are good enough and would get better pretty rapidly
    > if google went nuts, vanished, etc. That is part of the degree of protection:
    > since the person is not really locked in, google has to behave or lose them.


    Right, and exactly the same applies to any closed-source app that
    doesn't hijack your data. Consider Google Mail: are you going to
    regularly download copies of everything to be safe? Possibly; then you
    have the exact equivalent of an application which is closed-source but
    has an 'export' function (as they all SHOULD have).


    > > Some tolerance for closed source is one thing, hijacking users' data is
    > > the bit that REALLY makes me see red!-)

    >
    > There is no doubt this should exist; I have just seen a number of cases where
    > it does not, and companies get screwed pretty badly.


    You're preaching to the choir on this point. Let's debate closed-source
    apps _with_ a decent import/export function to standard or anyway easily
    accessible data formats.


    > > And Bengt's idea (apart from requiring a highly specialized CPU in lieu
    > > of pretty generic network access) doesn't appear to be any more
    > > objectionable than web services, either.

    >
    > The webservice one is much less of a burden on the system. You can upgrade


    Sure, it runs on another system. But it's more of a burden on your
    network, which may roughly even things out.

    > your computer, use it from a different computer, use it through a proxy, etc.
    > You can't do any of that with an application running on your machine tied to
    > some ID on your CPU. Many web services even work with a wide range of browsers,
    > so you are not tied into any given OS, browser, hardware platform, etc.


    I wish more web services did support a wide variety of clients (not just
    browsers, of course), but, sure. But the point is a webservice which
    does require you to authenticate to exploit it fully (as google does:
    the google API requires you to register w/them, they do NOT want you to
    just screenscrape from them even though they can't fully stop you...
    yet). If your machine's ID changes, a supplier will be happy to sell
    you some "hardware upgrade" service w/a new license (we did that
    routinely at my previous employer, even though the ID was just something
    we synthesized ourselves, quite crackably); just as a commercial
    supplier of a webservice requiring authentication will be happy to sell
    you more bandwidth, more diskspace use on their servers, and so on.

    The difference in business models is nowhere as drastic as you make it,
    in other words.

    > Yes, your data can still be locked in a proprietary format kept only on the
    > server, so you never see it except in reports on that data - and I suspect
    > that does happen - but many services make it very easy to get all of your data
    > out of the system.


    So do many proprietary closed source applications, though (just like for
    webservices) definitely not all of them. Data hijacking is a completely
    separate issue, and, as I said, one of my hobbyhorses anyway, so let's
    leave it out of a debate where it doesn't really enter.


    > So while the webservices can do some of the negative things the crypto cpu
    > would, it also automatically makes many things far better. If my computer


    Nothing automatic about it.

    > takes a lightning strike I can use another computer to use the webservice
    > right away. With the crypto app you are pretty much screwed until you go
    > through the whole authorization song and dance.


    So does the webservice, if it uses (e.g.) IP address as part of your
    authentication (quite a few do, to avoid the obvious piracy idea of
    paying for 1 computer using it then use it from five -- of course, these
    days, with NATs and dynamic IPs, it's more likely that there will be
    involved such things as certificate files... which will go if your
    computer takes a lightning strike...:).


    Alex
    Alex Martelli, Oct 26, 2004
    #9
