This is topic programming in forum Ask a Geek! at The Geek Culture Forums.


To visit this topic, use this URL:
http://www.geekculture.com/cgi-bin/ultimatebb/ultimatebb.cgi?ubb=get_topic;f=12;t=002017

Posted by stevenback7 (Member # 5114) on May 05, 2006, 15:06:
 
Okay, so from my previous question I learned that I need to get myself into a project of some sort. Question is, what's the best kind of internet project to get involved in, and secondly, out of the computer languages like Perl, Lisp, Python, etc., which one is the most commonly used? I've learned Turing, but Turing is more a program used by schools, something that isn't widely adopted by programmers.
 
Posted by drunkennewfiemidget (Member # 2814) on May 05, 2006, 17:48:
 
#1: Getting involved in a project might be difficult if you don't know any of the languages yet. I suggest, as Rhon has told you before, to look into O'Reilly's books. Most of them are very very good.
#2: Asking which is most popular is probably grounds for a holy war. Everyone will give different answers. Technologies in use on the web today include (but are not limited to): Java, Perl, Python, PHP, Ruby, Ruby on Rails, VB, VB.Net, C#. Just try each and find which one you like.
#3: I was going to suggest that a good way to get into a project and help out while you're learning would be to try writing some technical documentation, as good technical writers are in short supply. That being said, however, you'd have to work quite a bit on your spelling and grammar.

Good luck.
 
Posted by stevenback7 (Member # 5114) on May 05, 2006, 17:56:
 
Thanks. Yeah, I know my spelling is bad in forums, but honestly I'm quite good at writing. Yes, I know deciding which language is best would be a holy war, because honestly they're all pretty good (it was a dumb question). Right now I am trying to re-master HTML and learn Python again.
 
Posted by FireSnake (Member # 1181) on May 05, 2006, 19:23:
 
I'd suggest C, it's a classic, after all. And don't screw around with IDEs and all that garbage. Get yourself a nice little command line compiler and a copy of Vim. (All you Emacs people can suck it. [Wink] )

There's a version of Vim for Windows, too.
http://www.vim.org/

Also, install Cygwin (for gcc, a C compiler).
http://www.cygwin.com/

And I'd suggest Perl, too. It's quite handy.

You can get ActiveState's here:
http://www.activestate.com/Products/ActivePerl/
(Yes, it's for Windows.)


A good site for programming links:
http://www.programmingtutorials.com/

And a couple others:
http://www.cprogramming.com/
http://learn.perl.org/
 
Posted by stevenback7 (Member # 5114) on May 06, 2006, 10:24:
 
OK, thanks for the advice. I will attempt to learn C as soon as I get another computer (I really don't want to start messing around with my other PC).
 
Posted by Metasquares (Member # 4441) on May 06, 2006, 19:54:
 
Learning C (and C++ or Java, to get an idea of what OOP is like) will help you learn related languages very quickly, but as far as using a language goes, that decision should be made on a case-by-case basis given the requirements of your project and the ease with which you can fulfill those requirements in a certain language. Don't be afraid to learn new languages if you don't know the optimal language for a project, so long as you have enough time to correct a few mistakes and learn the language well.

A common misconception is that programming is the art of learning to use a programming language. Maybe that will work to a point, but being a Good Programmer involves learning to use a bunch of non-language tools to their fullest as well. You should know, hands down, how to use:

* At least one programming language.
* At least one editor on each platform you target.
* At least one debugger on each platform you target.
* How, why, and when to use version control systems such as CVS or Subversion.
* Utilities that will help you analyze the output of your program (things like grep, more, find, etc.)
* Optionally, utilities such as code profilers, optimizers, or memory profilers.

You know you're growing as a programmer when your focus shifts from syntax of the language to semantics - when you no longer think of how you're going to implement something in code and start thinking about implementing it in terms of an independent approach. As you become a better programmer, the details of the language will matter less and less.

I've never heard of the Turing language before. It seems like yet another dialect of Pascal. That's an excellent foundation language as well.
 
Posted by nerdwithnofriends (Member # 3773) on May 06, 2006, 23:04:
 
quote:
Originally posted by FireSnake:
...
(All you Emacs people can suck it. [Wink] )
...
i <3 u
 
Posted by stevenback7 (Member # 5114) on May 07, 2006, 11:04:
 
quote:
I've never heard of the Turing language before. It seems like yet another dialect of Pascal. That's an excellent foundation language as well.

Yeah, Turing is one of those computer languages that likes to be taught in Canadian high schools; it's very similar to other languages like Python.
 
Posted by GameMaster (Member # 1173) on May 07, 2006, 16:12:
 
FS,

I'm an equal-opportunity editor user. I use nano, vi, and emacs. vi has the largest learning curve for people coming from Windows. nano/pico are a fair middle ground, and I often use them over ssh because the university's CS servers have weird key mappings, and the vi commands don't quite work with them; it's easier to use pico/nano than to remap the keys.

Emacs I use for writing larger things that span multiple documents. The split windows are really useful, as is setting a compile command and running it with a quick keystroke.

vi is what I use for quickly editing something like a script or file... Especially when I'm doing a "search" or "search and replace"...
 
Posted by Kinguy (Member # 4527) on May 08, 2006, 01:55:
 

 
Posted by uilleann (Member # 1297) on May 08, 2006, 05:13:
 
I'd never personally recommend C as a starter language just because it's so easy to make a trivial but fatal error. Even as my second language, I was causing my DOS PC to reboot every time I tried running code! (I didn't grasp pointers for some reason and I was doing something that triggered an instant reboot of DOS.) Then there's the classic one, strcpy or memcpy into a char array that's maybe just one character too short. I did some maths wrong once on how big a buffer should be (to back up screen content behind a window) and during one execution I was treated to the most adorable torrent of pretty-coloured text, every text cell was a different foreground and background colour and character. Probably until I hit ctrl-alt-del.

C is a language for the very brave or the very dedicated. The bitter irony is that programs written in C (and worse, assembler) are typically the most reliable! (Visual Basic is all cuddly and cute and everything written in that dies with stupid errors if you just look at it wrong)

The most reliable language to date IMO is assembler; as I've said, 80s games never crashed, and they were far more complex than a lot of modern apps. Especially if you had no hardware sprites and no horizontal hardware scrolling and had to write all of that from scratch in about 16 k at 2 MHz. The mind boggles. We're all getting soft these days and it shows!
 
Posted by Doco (Member # 371) on May 08, 2006, 06:31:
 
Kinguy - I'll take the opposite end of that rant. Every new guy I end up training who has done nothing but applications written in Visual Studio has no idea what the hell he's doing with threads or semaphores, and his code leaks memory like a damn sieve. I work mostly with embedded systems, and the apps are supposed to run continuously for weeks/months/years. People on Windows platforms get so used to the idea that they can have a stack size in megabytes, and that the user will always be there to restart the application if it starts wigging out.
 
Posted by Doco (Member # 371) on May 08, 2006, 06:37:
 
Just to add my 2 cents for the original question - I would recommend Java as a starting language. The standard library and its construction push you to avoid most of the common beginner errors that people have with C/C++ and pointers. I also recommend it as having a much better OO design viewpoint than C++. I personally prefer C++ for my work - but it is really too easy to slip into writing procedural code that isn't very OO, because the language and libraries allow you to do either.

However - if you ever want to play in my world of embedded systems - you will need to graduate to using C/C++ at some point. No other language has made much of an impact on that realm yet. (Yes, there are Ada and assembly and others - but C and C++ are by far the most dominant.)
 
Posted by uilleann (Member # 1297) on May 08, 2006, 07:41:
 
Java's kinda weird. Psion managed to write a JVM for a 16 MB PDA, where file size on disc plus heap cannot exceed 16 MB! Which means Sun has no excuse when stumbling on a Java applet in Firefox grows the heap by 50 MB ;)

And of course, you need room in RAM for Opera too, to surf the web and find those Java applets. And space for your other processes and files. In 16 MB. Woo.
 
Posted by angryjungman (Member # 2434) on May 08, 2006, 07:45:
 
I'm going to recommend Python. And I'm sure Spiderman will come along at some point and suggest Ruby [traitor that he is [Razz] ].
 
Posted by Spiderman (Member # 1609) on May 08, 2006, 07:55:
 
I'm going to recommend Ruby. [Big Grin]

Python is great, and I have nothing against it, it's just that Ruby is better. [Wink]

Seriously though, I know there are many opinions on this topic, but I believe that learning a simpler language like Ruby or Python will help you become familiar with programming concepts. Then go on to learn something like C.
 
Posted by GameMaster (Member # 1173) on May 08, 2006, 09:25:
 
Kinguy and U, I disagree.

Learning to manage your memory early, by making those mistakes, is vital to becoming a good programmer. Likewise, learning how to use a semaphore/mutex/condition variable properly is also VERY important.

Here are a few basic rules to make life easier:
- If you open it, close it when done.
- If you malloc it, free it when done.
- If you new it, delete it when done.
- If you spawn it, join it when done.
- If you lock it, do your work with the resource, then unlock as early as possible.
- If you've deleted or freed memory, point all pointers that used to point into it at null. Then don't touch them till they point somewhere "good".
- Always acquire locks in the same order.
- Always release locks in the reverse order from which you acquired them.
- sizeof is your friend.
- Put constants first in comparisons.
- Watch for off-by-one errors.
 
Posted by drunkennewfiemidget (Member # 2814) on May 08, 2006, 09:55:
 
quote:
Originally posted by GameMaster:

- Put constants first in comparisons.

I agree with all of your points except that one. Code that compares with the constant first looks funny and is counterintuitive, imnsho.
 
Posted by uilleann (Member # 1297) on May 08, 2006, 10:27:
 
GM: Depends. If you're doing it at home, alone, and don't understand what you're doing, you'll eventually give up in despair [Razz] If you're starting out learning, it's not overly encouraging to be having lots of crashes and meaningless behaviour from things you don't and probably can't understand right then. It's one thing to talk about mutexes and so forth but it's another to understand all of this from day one. The significance of most of what you wrote is beyond beginner level.

Pure C is also very nasty -- it makes you use scanf for input, whose main purpose in life is to crash [Wink]

The ideal condition would probably be C, but under wise tutelage to help you figure out the lethal mistakes. And boy, can things go wrong. We had to write a Web server in C++ under *NIX, so somewhere I had a piece of code that looped, reading chunks of a file into a buffer and passing them to a socket via write(). If the file on disc was too large, the server thread would lock up in one of the calls to write() part-way through the process. Even the lecturer had no clue... why would write() crash?

At the very least you need a good book. Maybe two or three, so that if you can't follow explanations in one of them, you can try for another (or look online). C is good when you understand it, it's impossible when you don't quite. You'll be very well acquainted with the segmentation fault (or in DOS, the BIOS screen!) within short order [Smile]
 
Posted by Kinguy (Member # 4527) on May 08, 2006, 10:30:
 

 
Posted by chromatic (Member # 164) on May 08, 2006, 11:16:
 
C and Java are pretty horrible languages for beginners for different reasons. C is basically glorified and slightly-portable assembly and, unless you're the kind of person who finds the intricate and exceedingly-concrete details of assembly interesting, probably too foreign to your way of thinking to start.

Java is too verbose and exposes too many details of its way of thinking from the start.

Ruby, Perl, and Python are simultaneously more expressive and less dogmatic. I like Perl the most and Python the least, but any of those three is a good place to start.
 
Posted by stevenback7 (Member # 5114) on May 08, 2006, 12:00:
 
Okay, thanks for all the advice. My one question is: how many starter languages (Python, Perl, Lisp, Ruby, etc.) should I be familiar with before tackling C, C++, and Java?
 
Posted by Spiderman (Member # 1609) on May 08, 2006, 12:15:
 
quote:
Originally posted by stevenback7:
Okay, thanks for all the advice. My one question is: how many starter languages (Python, Perl, Lisp, Ruby, etc.) should I be familiar with before tackling C, C++, and Java?

I think you'll know when you're familiar enough to start tackling more difficult topics. Things will start to just make sense.

YMMV
 
Posted by Metasquares (Member # 4441) on May 08, 2006, 13:30:
 
quote:
Originally posted by uilleann:
I had a piece of code that looped, reading chunks of a file into a buffer and passing them to a socket via write(). If the file on disc was too large, the server thread would lock up in one of the calls to write() part-way through the process. Even the lecturer had no clue... why would write() crash?

write() raises SIGPIPE if the other end disconnects at any point. That might be your problem.

Did you try using send()/recv() instead of write()/read()? Not that it should make much difference.
 
Posted by uilleann (Member # 1297) on May 08, 2006, 15:06:
 
Metasquares: No, I was using a browser on the same machine, which would just stop receiving data [Smile] It was just being silly. Goodness knows.

I think the weirdest thing I've found to date is BBC BASIC (the 80s home-micro language), which is a typed language whose functions nonetheless have no declared return type. So I can write a function that returns a string, float, or int depending on what it feels like, so long as I use the right kind of variable to receive the result each time I call the function. Otherwise the program will crash. A rather strange oversight in the language, I think! (Though it was years before it dawned on me that this was possible, and I proved that it was. So maybe it was a deliberate attempt to reduce complexity!)
 
Posted by Metasquares (Member # 4441) on May 08, 2006, 16:44:
 
quote:
Originally posted by Kinguy:
Failing with a giant boom and a stack trace is much more useful than just silently accepting it.

One of my professors used to call that principle "blow up early, blow up often, and blow up loudly".
 
Posted by Kinguy (Member # 4527) on May 08, 2006, 21:12:
 

 
Posted by GameMaster (Member # 1173) on May 08, 2006, 21:42:
 
quote:
Originally posted by drunkennewfiemidget:
quote:
Originally posted by GameMaster:

- Put constants first in comparisions.

I agree with all of your points except that one. Code that compares with the constant first looks funny, and is counterintuitive imnsho.
I used to agree. But here's why I changed my mind.
code:
for (int i = 0; i = 20; ++i)
{
    cout << i << endl;
}

Compiles. And runs. Exactly like you told it to... forever, and ever, and ever...

Whereas:
code:
for (int i = 0; 20 = i; ++i)
{
    cout << i << endl;
}

won't compile.

Fail fast, it's one of the few places C/C++ lets you fail fast without adding anything extra.

Meta, I've heard the same thing from one of my profs... He's working on research about fail fast and uniqueness.

Streams in C++ are horrible in that they set a flag and quietly go on running... They should throw an exception, IMHO.
 
Posted by Metasquares (Member # 4441) on May 09, 2006, 07:57:
 
stream.exceptions(ios::badbit | ios::eofbit | ios::failbit);

There you go; now they'll throw [Smile]

IMO, the benefit of writing assignments constant first doesn't outweigh the loss of readability that results unless you are particularly prone to making that mistake. It doesn't happen often.
 
Posted by GameMaster (Member # 1173) on May 09, 2006, 11:43:
 
5 > 3 and 3 < 5 carry the same meaning in my eyes, as do i > 3 and 3 < i. I have no problem reading it either way, especially when talking about equality, ==.

Thanks for the tip about making the streams throw... I wish it were on by default, though, because whenever I pass through the lab there is always someone with an infinite loop because they tried to use
code:
int a;
while (!cin.eof())
{
    cin >> a;
    //stuff
}

Then they type a char or a string. The error flag flips, but eof isn't true. The stream stops taking input, and the body of the loop runs infinitely. I constantly see students do it, and I remember doing it myself. If it had just thrown, I would have realized: always read as a string and convert later.
 
Posted by Metasquares (Member # 4441) on May 09, 2006, 19:18:
 
You can actually make that:

while (cin >> a) {
    //stuff
}

and the loop will exit once the stream hits EOF (or the other bits get set). That's one of the most useful idioms I've seen in the language.
 
Posted by GameMaster (Member # 1173) on May 10, 2006, 16:40:
 
quote:
Originally posted by Metasquares:
You can actually make that:

while (cin >> a) {
    //stuff
}

and the loop will exit once the stream hits EOF (or the other bits get set). That's one of the most useful idioms I've seen in the language.

Yes, you can, but most of the people in the beginning programming courses who "... have a question, it'll just take a minute" don't know that yet... More important, they won't realize exactly why it works...

The >> operator returns the stream itself. The loop condition wants a bool, and the stream classes provide a conversion operator for exactly that purpose: classically an implicit operator void* that yields null when the error flag is set (newer C++ uses an explicit operator bool instead). The fact that there's an implicit conversion involved makes it not the ideal solution to offer the student who is doing (the trouble code -- same as above):

code:
int a;
while (!cin.eof())
{
    cin >> a;
    //stuff
}

and in a low-level course you'll be there for a year explaining how 'cin >> a' ends up as a bool - especially if they haven't seen casting yet.

Moreover, your code doesn't fix the actual problem that causes the infinite loop: the fact that a char was typed when a number was expected. What I generally advise is to always read as a string and use atoi(). For anything real, where the people looking at it are coders and not early students, the readability of "while (cin >> a)" or "while (getline(cin, a, '\n'))" is nice (given a is a string).

Another thing I sometimes do help readability is:
code:
#define EVER (;;)

for EVER
{
}

That way, it's clear that the loop is meant to run forever and that I didn't just forget to fill in the rest of the for loop.
 
Posted by Metasquares (Member # 4441) on May 10, 2006, 18:34:
 
Oh, you meant it freezes when they type something non-numeric when it's expecting a number. You're right, that won't fix that problem.
 


© 2015 Geek Culture

Powered by Infopop Corporation
UBB.classic™ 6.4.0