copy protection / IP protection


Bent C Dalager

I'm sorry - I cannot imagine how this would work.

Now something I can imagine is a signing authority for software which
ensures that it is safe.

Which, presumably, is how this will work :)

Except it doesn't ensure that a piece of software is "safe", but that
it is "trusted".

All DRM software has to be digitally signed by a
trusted source, OR it can be non-DRM and be virus scanned.

I believe this confuses the issue somewhat.

There will not be much need for DRM software as such in a trusted
hardware chain. Instead, pieces of software are either trusted or
they are not. Non-trusted software will not be given access to data
marked as protected, while trusted software will.

A properly signed virus scanner will be trusted and so will be given
access to the data it needs to scan applications for viruses.

On Windows, it is likely that decisions of trust are mainly handled by
the OS (with wary hardware looking over its shoulder) and that trust
certificates etc. are organized and distributed in cooperation with
Microsoft.

Cheers
Bent D
 
L

Luc The Perverse

Bent C Dalager said:
A properly signed virus scanner will be trusted and so will be given
access to the data it needs to scan applications for viruses.

I'm sorry but I guess I have to disagree with you. Trusted programs cannot
see each other, or else you will eventually have a corrupt application get
signed and extract runnable code from all products using the DRM. I don't
think the OS will be able to see or interact directly with the program
either. In exchange, though, the software won't be able to directly
interact with the OS either (in the same way Java byte code interacts with
the JVM), except that in this case the virtual machine is actually a
separate core on the processor.

The program wouldn't have to be deliberately corrupt. Let's say that there
are 5000 encrypted DRM programs out there and one of them was owned by a
crappy garage company which goes out of business. As their dying act they
make all of their code open source and publish it - but the original signed
encrypted copies still exist and can be purchased on eBay - so some hackers
buy it, and use the open source code to gain access to the black box in the
CPU by exploiting security holes in the product. Then because the CPU will
give up everything as soon as the certificates match, they can purchase
every DRM protected piece of code out there, extract the code, and
begin selling it illegally. Once you have the disassembly, after all, it is
pretty easy to make hacks - it happens all the time. So I think, by this
example, allowing signed authorities to see each other's data doesn't make
much sense.

I feel the same way about online transactions. I purchase from Newegg and
they are signed by a supposedly secure company called VeriSign. However, I
would be seriously pissed if I found out that every company that has a
certificate with VeriSign could see my credit card and other info. It's the
same scenario - just in reverse. Companies will not pay to have their
software protected if it isn't going to be protected at all.
 

Bent C Dalager

Luc The Perverse said:
I'm sorry but I guess I have to disagree with you. Trusted programs cannot
see each other, or else you will eventually have a corrupt application get
signed and extract runnable code from all products using the DRM.

You will always have this risk. Trusted software can be buggy, as can
trusted hardware. The main risk is that some vendors (both software
and hardware) will deliberately build in back doors, as we see with
region free DVD players etc.

Anyway, chances are there will be several levels of trust that a piece
of software can have. They can go completely wild with this, of
course, but it's a safe bet that a music player will need a much lower
level of trust than a virus scanner will. The buggy music player
may only let the user copy music (perhaps) while a buggy virus scanner
can be a lot more problematic.

The level of trust required by the virus scanner makes it probable
that this functionality will primarily be part of the OS anyway.

Luc The Perverse said:
The program wouldn't have to be deliberately corrupt. Let's say that there
are 5000 encrypted DRM programs out there and one of them was owned by a
crappy garage company which goes out of business. As their dying act they
make all of their code open source and publish it - but the original signed
encrypted copies still exist and can be purchased on eBay - so some hackers
buy it, and use the open source code to gain access to the black box in the
CPU by exploiting security holes in the product. Then because the CPU will
give up everything as soon as the certificates match, they can purchase
every DRM protected piece of code out there, extract the code, and
begin selling it illegally. Once you have the disassembly, after all, it is
pretty easy to make hacks - it happens all the time. So I think, by this
example, allowing signed authorities to see each other's data doesn't make
much sense.

When this was known to have happened, the certificates of the cracked
software would be revoked, making it problematic for the general
public to use it: Once they find a need to upgrade/patch their
computer (or just connect to the Internet), the OS will download the
revocation list and automatically reject the suspect software.
Legitimate purchasers will get it back in working order again on the
next product update. (Which may be tied to their CPU's certificate or
some other unique, personal identification.)
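
In Java terms, the OS-side part of that check might look something like
this minimal sketch (the file names, and the idea of a locally cached
revocation list sitting next to the application's certificate, are
assumptions for illustration):

import java.io.FileInputStream;
import java.io.InputStream;
import java.security.cert.CertificateFactory;
import java.security.cert.X509CRL;
import java.security.cert.X509Certificate;

// Sketch of the revocation step: after downloading the latest
// certificate revocation list (CRL), check an application's
// certificate against it before allowing a launch. The file names
// below are placeholders.
public class RevocationCheck {
    public static void main(String[] args) throws Exception {
        CertificateFactory cf = CertificateFactory.getInstance("X.509");

        try (InputStream crlIn = new FileInputStream("revocations.crl");
             InputStream certIn = new FileInputStream("app.cer")) {
            X509CRL crl = (X509CRL) cf.generateCRL(crlIn);
            X509Certificate appCert =
                (X509Certificate) cf.generateCertificate(certIn);

            if (crl.isRevoked(appCert)) {
                System.out.println("certificate revoked - reject the software");
            } else {
                System.out.println("certificate good - allow the launch");
            }
        }
    }
}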

Warez groups could still use it to make non-protected versions of
older music/movies/software/etc but no new releases could be accessed
with the revoked certificates. Much warezed software would be
particularly useless since it wouldn't have the certificates necessary
to access a lot of media or hardware on an up-to-date system. And
there will be reasons for people to want their systems to be up to
date.

Also, Microsoft would be unlikely to give a crappy garage company the
licenses it needs to access other applications at its whim. Symantec
could get this, but Snakeoil Intl. wouldn't. Vulnerabilities would
tend to be less broad - perhaps affecting a media player or some such.

Luc The Perverse said:
I feel the same way about online transactions. I purchase from Newegg and
they are signed by a supposedly secure company called VeriSign. However, I
would be seriously pissed if I found out that every company that has a
certificate with VeriSign could see my credit card and other info. It's the
same scenario - just in reverse. Companies will not pay to have their
software protected if it isn't going to be protected at all.

It will be protected, but that's not really the point of the
software's certificate(s). The certificate is something that gives the
software privileges it otherwise would not have. If it wants to play
protected music, it may have certificates that allow it to do this. In
order to get this, it might have to negotiate with Sony or some
consortium for a certificate to read the media and with Microsoft for
a certificate that gives permission to stream it through the OS to
appropriate outputs.

Alternatively, if both the CD reader and the speakers are trusted and
the player software doesn't need to do anything fancy to the music, it
could just tell the OS to "stream media from CD to speakers" and it
would all happen through a trusted channel without any protected media
passing through the player software at all. This software may not need
any certificates since it isn't directly touching anything that is
protected.
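
No such API exists, of course, but the kind of call being described might
look like this sketch - every name in it is invented for illustration,
with stubs standing in for what would really be OS- and hardware-level
machinery:

// Hypothetical sketch of "stream media from CD to speakers" - no such
// API exists; every name here is invented, and the stubs stand in for
// the trusted OS and hardware pieces.

interface TrustedEndpoint {
    String name();
    boolean certificateValid(); // verified by the OS, not by the player
}

final class StubEndpoint implements TrustedEndpoint {
    private final String name;
    StubEndpoint(String name) { this.name = name; }
    public String name() { return name; }
    public boolean certificateValid() { return true; } // stub
}

final class TrustedChannel {
    private final TrustedEndpoint source, sink;
    private TrustedChannel(TrustedEndpoint src, TrustedEndpoint snk) {
        source = src;
        sink = snk;
    }

    // The OS checks both endpoint certificates before wiring them
    // together; the calling application never sees the media bytes.
    static TrustedChannel open(TrustedEndpoint src, TrustedEndpoint snk) {
        if (!src.certificateValid() || !snk.certificateValid())
            throw new SecurityException("untrusted endpoint");
        return new TrustedChannel(src, snk);
    }

    void stream(String mediaId) {
        System.out.println("OS streams " + mediaId + " from " + source.name()
            + " to " + sink.name() + " inside the trusted chain");
    }
}

public class PlayerSketch {
    public static void main(String[] args) {
        TrustedChannel c = TrustedChannel.open(
            new StubEndpoint("cdrom0"), new StubEndpoint("speakers"));
        c.stream("track01"); // no protected data passes through this program
    }
}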

Cheers
Bent D
 

Roedy Green

Bent C Dalager said:
The level of trust required by the virus scanner makes it probable
that this functionality will primarily be part of the OS anyway.

MS put the guts of defraggers in the OS. Defragger manufacturers
compete on the UI and the ordering algorithms. But the tricky stuff is
all done by the OS. It might even be that defraggers never even see
the contents of any files anymore.

You could handle virus scanners the same way. The virus software just
provides the patterns to look for, or provides code that runs in a box
without access to any OS services or outside-world contact - just a
boolean return.
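
In Java terms, the contract being described might look like this - the
interface is hypothetical, invented here just to make the idea concrete:

// Hypothetical contract for vendor scanner code that runs "in a box":
// it may inspect the bytes it is handed, but the only thing it can
// report back to the outside world is a single boolean.
interface BoxedScanner {
    boolean infected(byte[] executableImage);
}

// The OS would load the vendor's implementation inside a sandbox with
// no file, network, or OS-service access, hand it each executable in
// turn, and act on the boolean alone (e.g. refuse to launch the file).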
 

Luc The Perverse

Roedy Green said:
MS put the guts of defraggers in the OS. Defragger manufacturers
compete on the UI and the ordering algorithms. But the tricky stuff is
all done by the OS. It might even be that defraggers never even see
the contents of any files anymore.

You could handle virus scanners the same way. The virus software just
provides the patterns to look for, or provides code that runs in a box
without access to any OS services or outside-world contact - just a
boolean return.

It's the same thing - you allow programs to access directly or indirectly
the code inside the black box and it can be extracted in entirety.

I could make a function which uses a search function to decode a string.
In fact it would be pretty easy. Make a fake virus scanner: start with one
byte; if it is not found, search for another byte; now build the byte string
backwards until you can't add any more characters, then build it forward
until you have reconstructed the unknown string. That string, in this
instance, is the program, function or media desired. It might be a little
more difficult if you get false positives, but I think that algorithm would
work.
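
A sketch of that algorithm, assuming a search facility that answers
yes/no for an arbitrary byte pattern (the SearchOracle interface is
invented for illustration, and as noted, false positives would
complicate things):

import java.util.Arrays;

// Sketch of the extraction attack described above, assuming a search
// facility that answers yes/no for an arbitrary byte pattern. The
// SearchOracle interface is invented for illustration.
interface SearchOracle {
    boolean contains(byte[] pattern);
}

class OracleExtraction {
    // Reconstruct the hidden byte string using only yes/no answers.
    // Assumes no false positives - as noted above, those would make
    // things harder.
    static byte[] extract(SearchOracle oracle) {
        // Step 1: find any single byte that occurs in the data.
        byte[] current = null;
        for (int b = 0; b < 256 && current == null; b++) {
            byte[] probe = { (byte) b };
            if (oracle.contains(probe)) current = probe;
        }
        if (current == null) return new byte[0];

        // Step 2: grow the match backwards until no byte extends it.
        for (boolean grew = true; grew; ) {
            grew = false;
            for (int b = 0; b < 256 && !grew; b++) {
                byte[] cand = new byte[current.length + 1];
                cand[0] = (byte) b;
                System.arraycopy(current, 0, cand, 1, current.length);
                if (oracle.contains(cand)) { current = cand; grew = true; }
            }
        }

        // Step 3: grow forwards until the whole string is recovered.
        for (boolean grew = true; grew; ) {
            grew = false;
            for (int b = 0; b < 256 && !grew; b++) {
                byte[] cand = Arrays.copyOf(current, current.length + 1);
                cand[cand.length - 1] = (byte) b;
                if (oracle.contains(cand)) { current = cand; grew = true; }
            }
        }
        return current;
    }

    public static void main(String[] args) {
        // Toy demonstration against an in-memory "black box".
        String secret = "protected program bytes";
        SearchOracle oracle = p -> secret.contains(new String(p));
        System.out.println(new String(extract(oracle)));
    }
}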
 

Luc The Perverse

Bent C Dalager said:
I doubt it would be taken quite this far. More likely, every time you
launch Eclipse the OS will show you large animated scare-windows
saying things like "you are trying to start an untrusted application",
"this application may or may not be virus infected" etc.

Microsoft tried something similar, mysteriously making several ISPs stop
working with their Windows 95 operating system around the same time the
MSN service appeared.
 

Chris Uppal

Bent said:
[...] but it's a safe bet that a music player will need a much lower
level of trust than a virus scanner will. The buggy music player
may only let the user copy music (perhaps) while a buggy virus scanner
can be a lot more problematic.

I don't see where virus scanners fit into this picture at all. If all the
executable software is signed-and-sealed, what would the virus vector be ?

The only one /I/ can think of is the old evil-macro style of malware. If there
are any applications which execute code which isn't machine code (which contain
script languages in some broad sense) then the scanner can look for infections
in their data files without special privileges. If the data could contain
malware, but is not accessible with "normal" privileges, then how did the
malware get into it in the first place ?


[re: cracked privileged apps]
When this was known to have happened, the certificates of the cracked
software would be revoked, making it problematic for the general
public to use it: Once they find a need to upgrade/patch their
computer (or just connect to the Internet), the OS will download the
revocation list and automatically reject the suspect software.

I can't really see this. I think the kind of "computer" we are talking about
here would no longer be the general purpose computation device that we are used
to (and love), but an /appliance/. It would have certain abilities built into
it, but would have little or no support for field update. Certainly no support
for updates by the /user/. This is the approach that MS are currently
experimenting with on the X-Box.

-- chris
 

Bent C Dalager

Chris Uppal wrote:

I don't see where virus scanners fit into this picture at all. If all the
executable software is signed-and-sealed, what would the virus vector be ?

I am sure they will think of _something_ :)

Chris Uppal wrote:
The only one /I/ can think of is the old evil-macro style of malware. If there
are any applications which execute code which isn't machine code (which contain
script languages in some broad sense) then the scanner can look for infections
in their data files without special privileges. If the data could contain
malware, but is not accessible with "normal" privileges, then how did the
malware get into it in the first place ?

Some applications would presumably be able to process both protected
data and unprotected data from general sources (e.g. the
Internet). Take a trusted web browser that can play protected
music files and can display any old JPG from the net. The JPG could
contain a buffer overflow exploit that compromises the web browser,
perhaps storing its virus code within the browser's configuration
data.

While you may find it difficult to compromise an application's binary
file without compromising the OS itself, this could certainly be
possible. Let's say that the browser you compromised turns out to be
an integral part of the OS and that this effectively lets you access
application binaries through system calls. You might know of a
specific application whose certificate you have cracked by some means
or other, and you can then modify this application to contain your
virus code.

Chris Uppal wrote:
[re: cracked privileged apps]
When this was known to have happened, the certificates of the cracked
software would be revoked, making it problematic for the general
public to use it: Once they find a need to upgrade/patch their
computer (or just connect to the Internet), the OS will download the
revocation list and automatically reject the suspect software.

I can't really see this. I think the kind of "computer" we are talking about
here would no longer be the general purpose computation device that we are used
to (and love), but an /appliance/. It would have certain abilities built into
it, but would have little or no support for field update. Certainly no support
for updates by the /user/. This is the approach that MS are currently
experimenting with on the X-Box.

I am not sure what you mean by "updates by the user".

I do not have an X-Box, but I do have a PSP and it is quite clear that
Sony is doing what it can to lock down the hardware and prevent me
from installing anything that is not Sony-approved. Bugs aside, this
is a strategy that could certainly work quite well. I won't be able to
do all that I would have liked to with the hardware, of course, but I
will be able to install new software on it so long as it has Sony's
approval. In this sense, the user can update it in the field.

It is likely that a trusted PC will be a little less strict than this
and that it will provide some sort of sandbox mode for untrusted
software, meaning that as a user I probably _could_ install anything I
wished. Of course, non-trusted software might not have the
capabilities that trusted software would, but them's the breaks.

Cheers
Bent D
 

Timo Stamm

Bent said:
When this was known to have happened, the certificates of the cracked
software would be revoked, making it problematic for the general
public to use it: Once they find a need to upgrade/patch their
computer (or just connect to the Internet), the OS will download the
revocation list and automatically reject the suspect software.
Legitimate purchasers will get it back in working order again on the
next product update.


What happens if someone finds an exploit in the software while the
company is in business?

Are the certificates revoked? That would very likely put them out of
business.

Bent said:
Also, Microsoft would be unlikely to give a crappy garage company the
licenses it needs to access other applications at its whim.

Do you know how most of the big players in IT started? Innovation comes
from the "crappy" garage companies. The industry will stagnate if you
hand the control over to the established companies.


Timo
 

Chris Uppal

Timo said:
The industry will stagnate if you
hand the control over to the established companies.

Agreed, but since when have big players in any industry refrained from actions
that bring them short-term advantage even at the cost of destroying the entire
industry ?

-- chris
 

Bent C Dalager

Timo Stamm said:
What happens if someone finds an exploit in the software while the
company is in business?

Are the certificates revoked? That would very likely put them out of
business.

Legitimate customers would have their applications patched through
automatic update, probably before the revocation list is updated. This
should provide a seamless experience for the large majority of people.

Timo Stamm said:
Do you know how most of the big players in IT started? Innovation comes
from the "crappy" garage companies. The industry will stagnate if you
hand the control over to the established companies.

This is why I said it should be interesting to see how non-trusted
software projects would fare in a new trusted world. Chances are these
are the projects where you will find the innovation and perhaps even
the quality, but this remains to be seen.

I expect that we are heading into a couple of decades of trial and
error where we will eventually find that wholesale market lockout of
non-approved developers is a really really bad idea. I may be wrong of
course - we will find that out in due time.

Cheers
Bent D
 

Timo Stamm

Chris said:
Agreed, but since when have big players in any industry refrained from actions
that bring them short-term advantage even at the cost of destroying the entire
industry ?

That's capitalism. Garage companies are no different, they also strive
for short-term advantage, maybe even more than the big players.

Getting rid of competitors is good for a company, but it's bad for the
customers. Just think about Microsoft Visual Studio, which is available
without charge today, mainly because free IDEs like Eclipse are serious
competition. On a Windows computer with a complete DRM chain, you would
probably be unable to run Eclipse.


Timo
 

Bent C Dalager

Timo Stamm said:
Getting rid of competitors is good for a company, but it's bad for the
customers. Just think about Microsoft Visual Studio, which is available
without charge today, mainly because free IDEs like Eclipse are serious
competition. On a Windows computer with a complete DRM chain, you would
probably be unable to run Eclipse.

I doubt it would be taken quite this far. More likely, every time you
launch Eclipse the OS will show you large animated scare-windows
saying things like "you are trying to start an untrusted application",
"this application may or may not be virus infected" etc.

Cheers
Bent D
 

Roedy Green

Luc The Perverse said:
I could make a function which uses a search function to decode a string.
In fact it would be pretty easy. Make a fake virus scanner: start with one
byte; if it is not found, search for another byte; now build the byte string
backwards until you can't add any more characters, then build it forward
until you have reconstructed the unknown string. That string, in this
instance, is the program, function or media desired.

My idea though is to isolate that code so that even if it does snoop,
it can't tell anyone what it found. Normally the snoop would, for example,
hide the information it found on disk somewhere, or squirrel it
away in RAM or CMOS, or emit a packet to the Internet. It would run in
a very restricted sandbox.

Since you are to die Mr. Bond, I can tell you my secret plan to
destroy planet earth...
 

Roedy Green

Bent C Dalager said:
I am sure they will think of _something_ :)

During start up, your computer behaves identically to the way it did
in DOS days. That is one place it is quite vulnerable. A floppy
accidentally booted from is God.

The key to the virus problem is digitally signing everything, not
allowing low-level access to executable files, and never running one
without checking that its signature has been previously verified.

A virus then has to infect prior to the exe being signed during
development.

That still leaves trojans of all forms. Now we get to requiring all
developers to submit DNA (their public keys) to the feds so that
computers can make sure software came from a reputable vendor and
they know who to sue if software causes damage. From a legal point of
view, with not that much change to OSes, you could prove whose software
formatted the hard disk, or deleted or overwrote a given file, or
better still keep each vendor inside his own sandbox where he can't
even see the files or registry entries from other vendors.
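
The check-before-run step might look something like this minimal Java
sketch (a detached signature file next to the executable, and a vendor
public key obtained out of band, are assumptions for illustration):

import java.nio.file.Files;
import java.nio.file.Path;
import java.security.KeyFactory;
import java.security.PublicKey;
import java.security.Signature;
import java.security.spec.X509EncodedKeySpec;

// Minimal sketch of "never run an executable without verifying its
// signature". Assumes a detached signature file shipped next to the
// executable, and a vendor public key obtained out of band - both
// assumptions for illustration.
public class VerifyBeforeRun {
    static boolean signatureValid(Path exe, Path sigFile, PublicKey vendorKey)
            throws Exception {
        Signature sig = Signature.getInstance("SHA256withRSA");
        sig.initVerify(vendorKey);
        sig.update(Files.readAllBytes(exe));       // hash the whole binary
        return sig.verify(Files.readAllBytes(sigFile));
    }

    public static void main(String[] args) throws Exception {
        // The vendor's public key, e.g. from a certificate shipped with the OS.
        byte[] encodedKey = Files.readAllBytes(Path.of("vendor.pub"));
        PublicKey key = KeyFactory.getInstance("RSA")
                .generatePublic(new X509EncodedKeySpec(encodedKey));

        if (signatureValid(Path.of("app.exe"), Path.of("app.exe.sig"), key)) {
            System.out.println("signature OK - launch");
        } else {
            System.out.println("signature bad - refuse to run");
        }
    }
}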
 

Jeffrey H. Coffield

Roedy said:
During start up, your computer behaves identically to the way it did
in DOS days. That is one place it is quite vulnerable. A floppy
accidentally booted from is God.

The key to the virus problem is digitally signing everything, not
allowing low-level access to executable files, and never running one
without checking that its signature has been previously verified.

A virus then has to infect prior to the exe being signed during
development.

That still leaves trojans of all forms. Now we get to requiring all
developers to submit DNA (their public keys) to the feds so that
computers can make sure software came from a reputable vendor and
they know who to sue if software causes damage. From a legal point of
view, with not that much change to OSes, you could prove whose software
formatted the hard disk, or deleted or overwrote a given file, or
better still keep each vendor inside his own sandbox where he can't
even see the files or registry entries from other vendors.

I find your point of view interesting in light of the fact that you are
one of the best sources of programming information around. I have been
supporting business systems for 26 years on computers for which no one
(including myself) has ever figured out how to write a virus, because
the hardware simply does not allow any user program access to any
system memory.

For people who like conspiracy theories, which type of computer would a
hardware vendor want you to buy? One that does everything you need (I
have programs written in 1980, currently running with HTML/Javascript
front ends, that still work because the business model hasn't changed)
or one that eventually gets so infected with viruses, worms,
spyware, etc. that you just buy a new computer?

Jeff Coffield
 

Luc The Perverse

Jeffrey H. Coffield said:
For people who like conspiracy theories, which type of computer would a
hardware vendor want you to buy? One that does everything you need (I have
programs written in 1980, currently running with HTML/Javascript front ends,
that still work because the business model hasn't changed) or one that
eventually gets so infected with viruses, worms, spyware, etc. that you
just buy a new computer?

If I may... comment on a subset of your post, ignoring your specific
question... If not, please disregard.

I assume you are alluding to Linux vs Windows or _insert believed superior
operating system here_ vs Windows.

The only thing I really have an objection to is this: if you are buying a new
computer instead of just formatting/reinstalling your system, that is a very
serious inefficiency - which you might want to address. And it is
possible to run Windows without getting viruses and spyware - typically it
is the user that is installing those things. (I realize there are some
backdoors, but it has never inhibited my ability to use a computer. The
only virus I have ever gotten is as a result of my own idiocy.)

So maybe the question is - do you want an operating system which protects
you by binding your hands, or an operating system which allows you and your
programs free rein? It's this free rein in the hands of novices that
causes problems.
 

Roedy Green

Jeffrey H. Coffield said:
I have been supporting business systems for 26 years on computers for
which no one (including myself) has ever figured out how to write a
virus, because the hardware simply does not allow any user program
access to any system memory.

What are you referring to?
 
