I started with machine language for the 6800, 6809, 65xx and similar, and moved on to COBOL, Fortran, and Pascal.
I still use Pascal to this day, albeit in its modern form (specifically FreePascal), and SQL, Perl and some others that are critical to IS.
I use C/C++ regularly, because ZC is coded in C++03.
What are your goals?
I'll admit that I do not care for Python, and my disdain stems from how Python handles BEGIN and END tokens: it doesn't have any, relying on indentation instead.
Let's look at some examples:
Using Pascal:

var
  a: Integer;
  s: String;

a := Random(5) + 2;
if ( a > 2 ) then
begin
  s := 'Above two';
end
else
begin
  s := 'Two or less';
end;
In C++:

int a;
char s[15];
a = rand()%(5-3) + 2;
if ( a > 2 )
{
    string s1 = "Above two";
    strcpy(s, s1.c_str());
}
else
{
    string s2 = "Two or less";
    strcpy(s, s2.c_str());
}
In each of these examples, the beginning and end tokens for the statements are explicit.
If you remove the indentation, intentionally or accidentally, the code still behaves the same way.
Let's look at a trickier example:

int a;
char s[15];
a = rand()%(5-3) + 2;
int b;
if ( a > 2 )
{
    string s1 = "Above two";
    strcpy(s, s1.c_str());
    if ( a > 4 )
    {
        b = a*a;
    }
}
else
{
    string s2 = "Two or less";
    strcpy(s, s2.c_str());
}
Here's the same code, without indentation:

int a;
char s[15];
a = rand()%(5-3) + 2;
int b;
if ( a > 2 )
{
string s1 = "Above two";
strcpy(s, s1.c_str());
if ( a > 4 )
{
b = a*a;
}
}
else
{
string s2 = "Two or less";
strcpy(s, s2.c_str());
}

As you can see, it's harder to read and follow, but you can still clearly determine that the else belongs to the first if statement, not the second.
As I don't recall how to use strings in Python at present, I'll give a short example of the issue that I have with its syntax:
import random

a, b = random.randint(2, 5), 0
if a > 2:
    if a > 4:
        b = a * a
else:
    b = a * 2
Now let's mangle the indentation, as happens when a web UI strips the formatting:
import random

a, b = random.randint(2, 5), 0
if a > 2:
    if a > 4:
        b = a * a
    else:
        b = a * 2
To which if statement does the else belong? In the first example, it clearly belongs to the first if, but after the indentation is mangled, it now belongs to the second if.
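What makes this dangerous is that both versions are syntactically valid Python, so nothing flags the change; the program silently computes a different result. A minimal sketch (pinning a to 3 instead of using random.randint, so the disagreement is visible):

```python
a = 3  # a value where the two indentations disagree

# Version 1: the else is bound to the OUTER if.
b = 0
if a > 2:
    if a > 4:
        b = a * a
else:
    b = a * 2
print(b)  # 0 -- the outer if was true, so its else never runs

# Version 2: the else is bound to the INNER if.
b = 0
if a > 2:
    if a > 4:
        b = a * a
    else:
        b = a * 2
print(b)  # 6 -- the inner if was false, so its else runs
```

Same tokens, same order; only the indentation differs, and the output changes from 0 to 6.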
Perhaps there is some way to avoid this catastrophe, but I'm not aware of it. I ran into it a few times when a web UI stripped the formatting out of a code block and turned it into a mangled mess.
In a language with explicit begin and end tokens of some kind, it's possible to reconstruct the intent. Without those, the intent dies a painful, agonising death.
Python does have some good stuff, but I'm not a fan of its syntax in general. The loss of scope is just the icing on the cake, for me.
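If the scope complaint refers to block scope (an assumption on my part), the concrete issue is that Python has no scope below function level: variables introduced inside a loop or if body stay visible after the block ends, unlike in Pascal or C++. A small sketch:

```python
def demo():
    for i in range(3):
        last = i * 10
    # In C++ or Pascal, i and last could be confined to the loop's
    # block; in Python both are still visible (and mutable) here.
    return i, last

print(demo())  # (2, 20)
```

The loop variable and anything assigned in the body simply leak into the enclosing function's scope.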
Again though, it depends on what you want to accomplish: it's a good idea to determine what best fits the task at hand.

SQL is probably the best-paying.
Another tip: Learn C99 before you waste gobs of time on C++11.
[...] And don't be dogmatic. Programmers tend to think their own ways of doing things are right and everyone who disagrees is objectively wrong, and it annoys the hell out of me.
[...]
That's the absolute truth. I think that most of us end up as grumbling old buggers who think our way is the highest science.
I started "programming" with Multimedia Fusion. After that I learned Visual Basic in high school, and Java, JS, PHP, XHTML, CSS, SQL, AS and C++ at uni. Since then I've dabbled with a few game engines: SDL for C++, Love for Lua (which I quite liked!), and I've recently jumped into GML for Game Maker. I've generally stuck with Clickteam for game dev, but after a short foray into GM I think I'll make the switch; with the amount of coding I do, it seems much easier to write a script than to have the code all broken up into events.
I saw VB in most of the posts here. Why the heck is MSVB the first language that's being taught these days?!
It's never useful in any sense, for any kind of professional programming, and its elements don't carry over very well to other stuff.
I understand why they teach Java, which IMO, is awful.
To be honest, courses should start with simple C concepts, and perhaps a bit of Perl. The practices that people pick up when learning Java are why most programmers who start with Java write such utterly deplorable code.
Java is an overrated pile of doo-doo, and it has strayed so far from its original purpose that the very idea of it disgusts me. Absolutely everything for Android OS is coded in Java, and thus is extremely wasteful and bogged down with nonsense.
I understand that it's portable as heck, but that is a gigantic trade-off in terms of performance, memory use, executable size, and other factors.
Further, apparently, universities are starting students off with stuff like Macromedia ActionScript and Lingo. Honestly, WTF?
Why not teach HyperCard, while you're at it. :/