Thank you,
but from my research I got these requirements ......
Python, Django, Twisted, MySQL, PyQt, PySide, wxPython.
*Technical proficiency with Python and Django.
Web application framework
*Technical proficiency in JavaScript.
Client-side web application
*Experience with MySQL / PgSQL.
Relational database -- unless you need to fully administer the DBMS or
use direct/obscure commands, knowing generic SQL may be enough (note that
Django will likely be using its own ORM package, so even SQL may not be
needed)
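To show what "generic SQL" means in practice, here is a minimal sketch using Python's built-in sqlite3 module; the table and column names are invented, but the same plain SQL would work against MySQL or PgSQL through their DB-API drivers:

```python
import sqlite3

# In-memory database just for illustration; the SQL itself is generic.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE packets (id INTEGER PRIMARY KEY, latency_ms REAL)")
conn.executemany("INSERT INTO packets (latency_ms) VALUES (?)",
                 [(1.2,), (3.4,), (2.5,)])
avg = conn.execute("SELECT AVG(latency_ms) FROM packets").fetchone()[0]
print(round(avg, 2))   # 2.37
conn.close()
```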
Well... that implies being fluent in the OS (probably at the shell
scripting level).
*Experience with MVC design patterns and solid algorithm skills.
While I know the term, I've not had much experience applying it...
Separation of the data (model) from the user interface (view), with the
logic linking the two (controller).
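That separation can be sketched in a few lines of plain Python; all the class names here are invented for illustration, not taken from any framework:

```python
class Model:                      # data: knows nothing about display
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)


class View:                       # presentation: knows nothing about storage
    def render(self, items):
        return ", ".join(items)


class Controller:                 # logic linking the two
    def __init__(self, model, view):
        self.model, self.view = model, view

    def add_item(self, item):
        self.model.add(item)
        return self.view.render(self.model.items)


ctrl = Controller(Model(), View())
print(ctrl.add_item("spam"))      # spam
print(ctrl.add_item("eggs"))      # spam, eggs
```

The point is that you could swap the View for an HTML template (as Django does) without touching the Model.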
Algorithm is another matter (the word basically is equivalent to
"recipe").
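Since an algorithm is just a precise recipe, a classic example makes the point; this is an ordinary textbook binary search, not anything specific to the job requirements above:

```python
def binary_search(sorted_seq, target):
    """The 'recipe': repeatedly halve the range that could hold target."""
    lo, hi = 0, len(sorted_seq)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_seq[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    # Return the index if found, -1 otherwise.
    return lo if lo < len(sorted_seq) and sorted_seq[lo] == target else -1


print(binary_search([1, 3, 5, 7, 9], 7))   # 3
```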
Core Python, Django framework, Web2Py, Google App Engine, CherryPy (basic introduction)
The problem for me is whether I have to learn all these technologies to work as a Python developer......
Django, Web2Py, GAE, CherryPy are all focused on /web-based/
(HTTP/HTML) applications. Python is just the implementation language.
If the goal is just pure Python none of those may be applicable. For
example, my most recent Python task was to generate UDP data packets to be
fed through "Cross Domain Solution" boxes... I had to generate packets of
various sizes, with some variation of contents [stuff that was supposed to
trigger "drop" or "edit" actions in the CDS box]. Wireshark was used to
capture the out-bound packets and the CDS-passed in-bound packets. Python
was used to match the Wireshark captures to produce an SQLite database.
Another Python program then extracted the latency data [outbound timestamp
vs inbound timestamp] for the packets and created a CSV file for Excel
plotting.
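The packet-generation part of that might look roughly like this; the payload layout, destination address, and sizes are all invented for illustration and don't reflect the actual CDS test format:

```python
import socket
import struct


def build_payload(seq, size, marker=b"OK"):
    """Invented layout: 4-byte big-endian sequence number + marker + padding."""
    header = struct.pack("!I", seq) + marker
    return header + b"\x00" * max(0, size - len(header))


def send_packets(dest=("127.0.0.1", 9999), sizes=(64, 128, 512)):
    """Fire off UDP packets of varying sizes (destination is made up)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for seq, size in enumerate(sizes):
            sock.sendto(build_payload(seq, size), dest)
    finally:
        sock.close()
```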
That's three Python programs, yet none are "web" related. They required
an understanding of the socket library, threading [the SQLite database
relied on threads to read the two Wireshark capture files, filtering out
all but the packet time-stamp and data ID string, and a third thread to
match the out/in packets for latency -- and reporting any missing packets],
and CSV library. Oh, and development of algorithms to do that processing.
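A toy version of the matching/latency step might look like the following; the data IDs, timestamps, and CSV layout are invented (the real programs read Wireshark captures and coordinated via threads and SQLite):

```python
import csv
import io

# Toy capture records: data_id -> timestamp in seconds (names invented).
outbound = {"pkt-1": 0.000, "pkt-2": 0.010, "pkt-3": 0.020}
inbound = {"pkt-1": 0.004, "pkt-3": 0.029}   # pkt-2 was dropped in transit

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["data_id", "latency_ms"])
missing = []
for data_id, t_out in sorted(outbound.items()):
    if data_id in inbound:
        # Match out/in packets by ID, latency = inbound - outbound timestamp.
        writer.writerow([data_id, round((inbound[data_id] - t_out) * 1000, 3)])
    else:
        missing.append(data_id)              # report dropped packets

print(buf.getvalue())
print("missing:", missing)                   # missing: ['pkt-2']
```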