C set-user-ID program wrapper for Perl script and security


Peter Michaux

I have a Perl script that I want to run as a set-user-ID program. Many
OSes don't allow scripts to be run set-user-ID. To make this script
portable, it seems I need to write a C wrapper program that calls exec
or system to give the Perl script the necessary effective permissions.
How can I make the C wrapper program secure, or at least "more" secure?

The Perl script, which is "-rwsr-xr-x root root", will look at the real
user ID and then check a permissions file that is "-rw------- root
root" to determine whether the real user can run the subcommand given
to the script.

Is it futile to attempt to solve my problem with a C wrapper program
around a Perl script? Writing this particular program all in C is
appealing from a purity point of view but I was going to be just
gluing together a bunch of command line tools like wget, chmod, tar
and a parser for YAML. Writing it all in C seems like overkill. If I
write this all in C then I suppose I need to find good libraries to
emulate all of these features.

Any suggestions?

Thanks,
Peter
 

Antoninus Twink

I have a Perl script that I want to run as a set-user-ID program. [snip]
Is it futile to attempt to solve my problem with a C wrapper program
around a Perl script?

Not at all. There's a discussion of exactly this in the page you get from
perldoc perlsec
and you might be able to find a wrapsuid script in your distribution
that generates a C wrapper automatically.
 

Keith Thompson

Peter Michaux said:
I have a Perl script that I want to run as a set-user-ID program. Many
OSes don't allow scripts to be run set-user-ID. To make this script
portable, it seems I need to write a C wrapper program that calls exec
or system to give the Perl script the necessary effective permissions.
How can I make the C wrapper program secure, or at least "more" secure?

The Perl script, which is "-rwsr-xr-x root root", will look at the real
user ID and then check a permissions file that is "-rw------- root
root" to determine whether the real user can run the subcommand given
to the script.

Is it futile to attempt to solve my problem with a C wrapper program
around a Perl script? Writing this particular program all in C is
appealing from a purity point of view but I was going to be just
gluing together a bunch of command line tools like wget, chmod, tar
and a parser for YAML. Writing it all in C seems like overkill. If I
write this all in C then I suppose I need to find good libraries to
emulate all of these features.

Yes, ask in either a Perl newsgroup or a Unix newsgroup. Standard C
has no concept of accounts or permissions.
 

Andrew Poelstra

Peter Michaux said:


No reason whatsoever. Malcolm is wrong. If you want to hardcode paths in a
C program, go to it. There will be an impact on portability (because the
path might not have the same semantics or might not even exist on another
machine), but that argument applies just as much to the Perl script.

Not really - a C program is almost always compiled, which means to change
a hardcoded path one needs to have access to the source code. By nature,
a Perl script is itself the source, meaning that any hardcoded paths are
going to be human-readable and mutable.
 

s0suk3

Speaking as both a C programmer and a Perl programmer, I can't think
of any good reason.

Perl is typically used as a "scripting language", i.e., a glue
language, a language to automate typical tasks, a test driver, etc.

C programs are usually targeted at more serious tasks and are
therefore better structured and developed with maintainability and
reusability in mind. Thus, there's no reason why such a program would
have hard-coded paths, except for example during testing.

Sebastian
 

Keith Thompson

Malcolm McLean said:
That's the answer. Hardcoded paths are a nuisance in a C
program. Typically C source is thousands of lines long, and the
executable might be detached from the source for use. So it's a big job
to change the paths.
On the other hand users expect Perl scripts to contain
paths. Typically they are quite short and farm out most of the
"serious" work to other programs. Perl started as a glorified shell
script, after all.

Perl has its own newsgroups.
 

Richard

Malcolm McLean said:
That's the answer. Hardcoded paths are a nuisance in a C
program. Typically C source is thousands of lines long, and the
executable might be detached from the source for use. So it's a big job
to change the paths.
On the other hand users expect Perl scripts to contain
paths. Typically they are quite short and farm out most of the
"serious" work to other programs. Perl started as a glorified shell
script, after all.

You have just stated that they exist. There is still no good reason why
hard coded paths *should* exist in Perl any more than they should in C.
 

Richard

Malcolm McLean said:
The alternative to hardcoded paths is softcoded paths. However Perl is

err, quite.
already an evolved shell script. The major thing that the shell does
is to allow the user to specify files by path. By demanding softcoded

As do all file commands in C.
paths in Perl you are taking away the natural facilities of the shell,
and creating a dependency.

Rubbish. No one is demanding anything. It's just that hardcoded paths are
often questionable. One should compute them by querying the
system or constructing relative paths. "Hard coded" here I take to mean
a specific URL/link with its root fixed.
On the other hand C is a compiled language. It is not possible to edit

Thanks for that too ....
the source and run, one must recompile. There might well be several

Uh huh. Recompile eh?
executables on different machines derived from the one C source. So
updating the C program when paths change is a major nuisance.

Updating ANY source when paths change is a nuisance. Hence you TRY to
make sure the program has the information it needs to construct a
proper path regardless of system and install location.
Basically, write everything that doesn't need hardcoded paths in C,
then write a Perl script to drive your C programs, with the paths
coded into the Perl script where it is too much of a burden on the
user to put them on the commandline.

I don't necessarily disagree with that. But again, where hard-coded
paths can be avoided in Perl/Python scripts too, then do so. Hard-coded
paths have been the bane of Linux adoption, for example, for years,
since half the hacked-up shell scripts to install certain software
assume a certain distro, only to be run on a different distro where the
paths are nonsense.
 

Bart

Andrew Poelstra said:

Fair point - but let's compare like with like. If you're shipping source,
ship source, in which case the C program is just as human-readable as the
Perl script (and possibly *more* so, given typical Perl scripts!), and
just as mutable. All you need then is a C compiler (which is analogous to
the Perl interpreter).

Why do I always get this sinking feeling whenever I need to download
something and it's only available as C source code instead of a
ready-to-go executable? Especially in some god-forsaken format like tar
and gz?

It's like the difference between buying, say, an amplifier ready-made,
and getting just the schematics, a pcb, and most of the parts.

By the time you've put it all together, and if you're very lucky, it
/might/ just work.
 

Antoninus Twink

Why do I always get this sinking feeling whenever I need to download
something and it's only available as C source code instead of a
ready-to-go executable? Especially in some god-forsaken format like tar
and gz?

By the time you've put it all together, and if you're very lucky, it
/might/ just work.

Because it's been 10 or 15 years since you last tried it?

Seriously, the huge majority of software distributed nowadays as source
uses the GNU autotools - type the three famous lines

./configure
make
make install

and It Just Works.
 

Peter Michaux

That's the answer. Hardcoded paths are a nuisance in a C program. Typically
C source is thousands of lines long, and the executable might be detached
from the source for use. So it's a big job to change the paths.
On the other hand users expect Perl scripts to contain paths. Typically they
are quite short and farm out most of the "serious" work to other programs.
Perl started as a glorified shell script, after all.

This doesn't seem like a very compelling argument to me. If a Perl
script is installed in /usr/local/bin like a compiled C program would
be, then no one should be editing the Perl file anyway. It should be
configured during the "perl Makefile.PL" step.

How can a C program not contain at least one path compiled in? That
path being /etc/myapp or whatever sysconfdir was set to during
./configure. All the other paths can be specified in the config file in
/etc/myapp, but if the app cannot find the config it can't really get
going.

Peter
 

Stephen Sprunk

Peter said:
Why is that?

Two reasons:

1) Perl is almost always interpreted, while C is almost always compiled.
If a hardcoded path is incorrect, it is much less work to change it in
a Perl script (which just requires a text editor) than in a C program
(which you may not have the source for, and which requires a full
development environment to recompile if you do).

2) Perl is used almost exclusively on POSIX systems, whereas C is used
across a much, much wider range of platforms. POSIX systems have a
common filesystem layout and pathname structure, which means it is much
less likely that a hardcoded pathname is incorrect in the first place.

S
 

Kenny McCormack

Because it's been 10 or 15 years since you last tried it?

Seriously, the huge majority of software distributed nowadays as source
uses the GNU autotools - type the three famous lines

./configure
make
make install

and It Just Works.

So goes the theory.

There's a fair amount of validity to Bart's position. Granted, it is
one of those things - it's free and it usually works. Whaddya want for
nothing? Things are much better with GNU autotools than they were in
the bad old days. Things are better still in the Windows world, but,
as we know, it doesn't always work there either.

As a data point, I recently built DosBox from source. Everything went
fine - got all the pieces (SDL, etc) built with no error messages - and
the resulting executable "almost worked". The keyboard mapping was
wrong, and there was no way to fix it.

So, no, it doesn't always work.
 

Keith Thompson

Malcolm McLean said:
The alternative to hardcoded paths is softcoded paths. However Perl is
already an evolved shell script. The major thing that the shell does
is to allow the user to specify files by path. By demanding softcoded
paths in Perl you are taking away the natural facilities of the shell,
and creating a dependency.
On the other hand C is a compiled language. It is not possible to edit
the source and run, one must recompile. There might well be several
executables on different machines derived from the one C source. So
updating the C program when paths change is a major nuisance.

Basically, write everything that doesn't need hardcoded paths in C,
then write a Perl script to drive your C programs, with the paths
coded into the Perl script where it is too much of a burden on the
user to put them on the commandline.

You are mistaken on several points, one of which is your misuse of the
term "shell script". If you're interested in the details, feel free
to e-mail me. This whole thing is entirely off-topic here in
comp.lang.c.
 

Richard

Keith Thompson said:
You are mistaken on several points, one of which is your misuse of the
term "shell script". If you're interested in the details, feel free
to e-mail me. This whole thing is entirely off-topic here in
comp.lang.c.


It is perfectly on topic to discuss how one does things in C related to
other languages.
 

James Kuyper

Bart said:
....
Why do I always get this sinking feeling whenever I need to download
something and it's only available as C sourcecode instead of a ready-
to-go executable? Especially in some god-foresaken format like tar and
gz?

I share your distaste for packages distributed as source code with
instructions on how to build it. I'm not too fond of executables
either - most ready-to-go executables that I've seen aren't
"ready-to-go" on my machine. I've had very good results, though, with
the packages I've installed that came in rpm format. But if you
consider tar and gz "god-forsaken", I suspect that you may have a
system where rpm won't do you much good, either.
 

Kenny McCormack

Malcolm McLean said:
That's the answer. Hardcoded paths are a nuisance in a C program. Typically
C source is thousands of lines long, and the executable might be detached
from the source for use. So it's a big job to change the paths.
On the other hand users expect Perl scripts to contain paths. Typically they
are quite short and farm out most of the "serious" work to other programs.
Perl started as a glorified shell script, after all.

I think your choice of terminology may inflame some of the easily
inflamed members of this NG. I think what you are trying to say is that
the Perl language started out as an amalgam of shell and AWK (and
probably other things...) The similarity to shell is there, although
hard to see by some lights.
 
