Newbie: splitting dictionary definition across two .py files

kar1107

Hi,
I'm fairly new to Python. I'd like to define a big dictionary across two
files and use it in my main file, build.py.

I want the definition to go into build_cfg.py and build_cfg_static.py.

build_cfg_static.py:
target_db = {}
target_db['foo'] = 'bar'

build_cfg.py:
target_db['xyz'] = 'abc'

In build.py, I'd like to do
from build_cfg_static import *
from build_cfg import *

...and then use target_db to access all the elements. The problem is that
I can't have the definition of target_db split across two files. I think
they reside in different namespaces? Is there any way I can have the same
dictionary definition split across two files?

The main reason is that build_cfg_static.py is user-written while
build_cfg.py is generated by a script. It helps to keep the two sets of
dictionary entries separate.

Thanks,
Karthik

Ben Cartwright

> I'd like to define a big dictionary across two
> files and use it in my main file, build.py.
>
> I want the definition to go into build_cfg.py and build_cfg_static.py.
>
> build_cfg_static.py:
> target_db = {}
> target_db['foo'] = 'bar'
>
> build_cfg.py:
> target_db['xyz'] = 'abc'
>
> In build.py, I'd like to do
> from build_cfg_static import *
> from build_cfg import *
>
> ...and then use target_db to access all the elements. The problem is that
> I can't have the definition of target_db split across two files. I think
> they reside in different namespaces?

Yes. As it stands, importing build_cfg.py will fail with
NameError: name 'target_db' is not defined.

Unless you're doing something ugly like calling exec() on its contents, a
.py file needs to stand on its own before it can be imported.
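A quick way to see why (the file names here are just illustrative):

# broken.py:
target_db['xyz'] = 'abc'   # no target_db defined in this file

# main.py:
import broken   # NameError: the module's top-level code runs on import
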
> Is there any way I can have the same
> dictionary definition split across two files?

Try this:

# build_cfg_static.py:
target_db = {}
target_db['foo'] = 'bar'

# build_cfg.py:
target_db = {}
target_db['xyz'] = 'abc'

# build.py:
from build_cfg_static import target_db
from build_cfg import target_db as merge_db
target_db.update(merge_db)  # merge the generated entries into the static dict
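
One caveat: dict.update() overwrites keys that already exist, so if the
same key appears in both files, the entry from build_cfg wins. For example:

db = {'foo': 'bar', 'shared': 'static'}
db.update({'xyz': 'abc', 'shared': 'generated'})
# db['shared'] is now 'generated'; 'foo' is untouched and 'xyz' is added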

--Ben

Ben Finney

I'm fairly new to python. I like to define a big dictionary in two
files and use it my main file, build.py

I want the definition to go into build_cfg.py and build_cfg_static.py.

That sounds like a very confusing architecture, and smells very much
like some kind of premature optimisation. What leads you to that
design? It's very likely a better design can be suggested to meet your
actual requirements.

Karthik Gurusamy

Ben said:
> [...]
> Try this:
>
> # build_cfg_static.py:
> target_db = {}
> target_db['foo'] = 'bar'
>
> # build_cfg.py:
> target_db = {}
> target_db['xyz'] = 'abc'
>
> # build.py:
> from build_cfg_static import target_db
> from build_cfg import target_db as merge_db
> target_db.update(merge_db)

Thanks; it works great.

I also found that doing the import inside build_cfg.py works:

# build_cfg_static.py:
target_db = {}
target_db['foo'] = 'bar'
# ... other dict entry definitions

# build_cfg.py:
from build_cfg_static import *
target_db['xyz'] = 'abc'
# ... more dict entry definitions

Still, I think using two separate dictionaries and merging them as you
suggested is a better approach than having one config file import the
other. The merge may incur some additional cost, but for my dataset size
it should be okay.

Karthik

Karthik Gurusamy

Ben said:
> That sounds like a very confusing architecture, and smells very much
> like some kind of premature optimisation. What leads you to that
> design? It's very likely a better design can be suggested to meet your
> actual requirements.


I work in a source tree of hundreds of .c files. The makefile builds the
various .o files from the .c files, and it takes a while to finish on the
first run. When I make changes to a .c file, I need to compile to get the
.o and quickly fix any compilation errors. I don't want to go through the
make infrastructure, as it takes a while to resolve the dependencies.


So effectively I'm writing a Python script as a poor man's makefile. On
the first run of make, I capture the complete stdout. Then I use another
Python script, build_generate.py, to grep through the make log and find
the exact gcc command used to build foo.o from foo.c. The script also
captures the current working directory; make is kind enough to spit out
lines like 'Entering directory /blah/blah/path/to/binary...'. It writes
its output as build_cfg.py (it prints to stdout, which I redirect).
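
A stripped-down sketch of what build_generate.py does (illustrative only;
the real regexes depend on your compiler and the exact format of the make
log):

# build_generate.py (sketch): read a captured make log on stdin,
# write a build_cfg.py to stdout.
import re
import sys

cwd = None
entries = {}
for line in sys.stdin:
    # GNU make announces directory changes with lines like:
    #   make[1]: Entering directory `/blah/blah/path/to/binary'
    m = re.search(r"Entering directory [`']([^']+)'", line)
    if m:
        cwd = m.group(1)
        continue
    # Assume a compile line mentions gcc and names its target via -o
    m = re.search(r"gcc .* -o \S*?([\w.-]+\.o)\b", line)
    if m:
        entries[m.group(1)] = {'cmd_cwd': cwd, 'cmd_str': line.strip()}

print('target_db = {}')
for target, info in entries.items():
    print('target_db[%r] = %r' % (target, info))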


I get a .o rebuilt in about 0.20 seconds, where the makefile easily takes
about 20 seconds; that's a 100x speedup.


I'm interested in only the few dozen .o files that I manage, so I run the
script to generate a dictionary of the form:


target_db['foo.o'] = {
    'cmd_cwd': r'/blah/blah/path/to/binary',
    'cmd_str': r'/path/to/gcc <tons of options> -o obj-xyz/foo.o .../blah/foo.c',
    # 'redirect': 1,
    # I can add any other flags I may think of.
    # In fact I'm planning to make cmd_str a list so that I can
    # run a series of commands.
}


In build.py, based on the target I give on the command line (build.py
foo.o), I look up the dictionary entry and execute the corresponding
cmd_str using popen2.Popen3.
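
For concreteness, the lookup-and-run part of build.py is roughly this (a
sketch, not the actual script; Python 2, since popen2 is gone in Python 3,
where subprocess.Popen with cwd= is the equivalent):

# build.py (sketch)
import os
import sys
import popen2

from build_cfg_static import target_db
from build_cfg import target_db as merge_db
target_db.update(merge_db)

entry = target_db[sys.argv[1]]          # e.g. build.py foo.o
os.chdir(entry['cmd_cwd'])              # Popen3 has no cwd option
proc = popen2.Popen3(entry['cmd_str'])  # a string is run via the shell
print proc.fromchild.read()             # the compiler's stdout
status = proc.wait()                    # raw exit status, as from os.wait()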


Now for every new source tree I pull, or whenever the makefile changes, I
rerun the script to generate a fresh dictionary.


I found build.py can also be used to automate other tasks, say pulling a
source tree; in general, to run any command (or list of commands) that you
would issue from the shell prompt. The dictionary entries for these other
tasks are static; they don't change when the makefile options change. That
is the reason I want to split the dictionary contents across two files: I
only have to regenerate build_cfg.py every time the makefile changes or I
use a new source tree.


I'm not really worried about optimizations at this time; I just want a
cleaner solution to my problem.

Karthik

Ben Finney

Karthik Gurusamy said:
> Ben said:
> > That sounds like a very confusing architecture, and smells very
> > much like some kind of premature optimisation. What leads you to
> > that design? It's very likely a better design can be suggested to
> > meet your actual requirements.
>
> So effectively I'm writing a Python script as a poor man's makefile.
> [...]
> I'm not really worried about optimizations at this time; I just want a
> cleaner solution to my problem.

There is SCons, a much-improved build system written in Python, that
may interest you.

<URL:http://www.scons.org/>

The build configurations are written in Python, so it seems to be
quite similar to what you're currently working toward.
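
For a flavour: an SConstruct file (SCons' counterpart to a Makefile) is
plain Python. A minimal one for a C program looks something like this; the
file names are invented for illustration:

# SConstruct
env = Environment(CCFLAGS='-O2')    # an SCons construction environment
env.Program(target='foo', source=['foo.c', 'bar.c'])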

Adam DePrince

> I work in a source tree of hundreds of .c files. The makefile builds the
> various .o files from the .c files, and it takes a while to finish on the
> first run. When I make changes to a .c file, I need to compile to get the
> .o and quickly fix any compilation errors. I don't want to go through the
> make infrastructure, as it takes a while to resolve the dependencies.

Dependencies for 100 files?

- Adam