Coding usage and misusage, please let's talk about it

Hi Guys

This is by no means an attack on you guys for less-than-good coding, but I have noticed a lot of conventions being broken in much of the code on Codeshare and the wiki.

I see a lot of


String someString = "";

where this should be


string someString = string.Empty;

Why??

Well there are 2 misuses here.

  1. String with a big S: yes, this is a complex type representing a string with a small s, which will wrap the string in a String, thus consuming more RAM and taking up more CPU cycles to create.

  2. By using "" you create a new string instance pointing to its own representation of "" (the empty string), whereas by using string.Empty your string will point to the shared representation of an empty string, which will take up much less memory (see the sketch below).
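Here is a minimal sketch you could run to test this claim yourself; note that what it prints depends on the runtime's string-interning behaviour, so treat it as an experiment, not proof:

using System;

class InterningTest
{
    static void Main()
    {
        string a = "";            // a literal empty string
        string b = string.Empty;  // the shared Empty field

        // True if both references point at the very same object in memory.
        Console.WriteLine(object.ReferenceEquals(a, b));
    }
}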

The other thing I have noticed is many uses of complex types such as Int16 or Int32 and the like. An Int16 IS a short, so use a short, not an Int16, because the same problem as with the string (part 1) applies here.
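For concreteness, a minimal sketch of the two spellings side by side (whether they actually differ is debated further down the thread):

using System;

class Spellings
{
    static void Main()
    {
        Int16 a = 42;   // the framework type name
        short b = a;    // no cast needed in either direction
        Int16 c = b;
    }
}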

I would always recommend the use of ReSharper, which will help in this regard, as well as StyleCop (which is compatible with ReSharper); these tools help out with naming conventions and less-than-good code.

I know many use _ for class or instance variables; whereas this is common in C++, IT IS NOT good code in C#. Read the code conventions for camel casing: Camel case - Wikipedia. A small illustration follows below.
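A small sketch of the casing conventions being referred to (assuming the StyleCop guidance of the time, which discouraged underscore prefixes; the names here are made up):

class Sensor
{
    // Discouraged by the conventions discussed here (a C++-style prefix):
    // private int _sampleRate;

    // camelCase for private fields, PascalCase for public members:
    private int sampleRate;

    public int SampleRate
    {
        get { return sampleRate; }
        set { sampleRate = value; }
    }
}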

I urge you guys to please adhere to those conventions, as this will make it 100 times easier for other developers to continue the work.

My best wishes.

Once again, this is not meant to attack anyone; I only wish to bring this out into the light and make us ALL better coders :slight_smile:

Please, let's talk about this: tell us what you think and why.



Keep in mind that many users here have an embedded-systems background in C programming, where short and INT16 meant exactly the same thing (#define INT16 short), so the feedback is helpful to aid others in knowing what is better and why.

I am well aware of this, but that is exactly why this is needed: C# is most certainly not C or C++. Because the simple types are named the same in both C/C++ and C#, while the complex types in C# are always a sort of wrapping of the simple type, it is very much necessary to know where the differences are and comply with them.

That is one of the reasons why I started this post.

PS: Gus, could you move this post to General Discussions? I just realized it is in the wrong category.

Where did you get that information? short is an alias of Int16, int is an alias for Int32, string is an alias for String. There is no wrapper class or additional memory usage…

In fact, Int16 is the correct .NET type and short is ONLY an alias.

From when I studied programming languages.

We had a really big discussion on this subject, and my university teacher explained it in depth. I don't remember the exact tale, but you can actually see differences by timing the creation of a short and the creation of an Int16 (well, you won't ever notice it).

C# will wrap short in an Int16 object when created as an Int16, but of course C# is clever enough to know that an Int16 can be cast directly to a short.

I've been using C# for as long as it has existed, and before that it was C++ and Java, so I have a fairly good knowledge of programming, and of C# especially.

Just to explain my point a bit more: when using ReSharper and StyleCop, you'll get a warning when using Int16, and the warning says something like "Use the built-in type instead".

While ReSharper is third-party software, StyleCop is maintained by Microsoft, and I believe it is StyleCop that complains about this.

int is an alias for Int32; read it here: http://msdn.microsoft.com/en-us/library/s1ax56ch(v=vs.71).aspx

Whoever told you that is wrong. I want you to declare an int and get its type by using typeof or GetType().
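A minimal sketch of exactly that experiment (the class and variable names here are just for illustration):

using System;

class AliasCheck
{
    static void Main()
    {
        int i = 42;
        short s = 42;

        // The keywords are aliases for the framework types, not wrappers.
        Console.WriteLine(i.GetType().FullName);            // System.Int32
        Console.WriteLine(s.GetType().FullName);            // System.Int16
        Console.WriteLine(typeof(int) == typeof(Int32));    // True
        Console.WriteLine(typeof(short) == typeof(Int16));  // True
    }
}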

Well, okay, I did go and read about it, and it seems I am misinformed in this regard. But nevertheless, I am correct in my opinion that people (including me) need to know more about the differences between the languages.

Conventions are there for a reason.

I just wanted to make people aware that there are conventions, and differences, that should be kept.

Note that I prefer the use of simple types too, but I would really doubt that using Int16 instead of short would create slower code.

At the point in time when we discussed this in class, I made a code sample that would measure the time difference, and I did see a slight difference, but I don't remember whether it was consistent or not.

If it wasn't consistent, it could well have been differences in timing from the Windows scheduler.
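For anyone who wants to repeat that experiment, a minimal sketch using Stopwatch (since short and Int16 name the same type, any difference measured this way is likely noise from the scheduler, the JIT, and so on):

using System;
using System.Diagnostics;

class TimingSketch
{
    static void Main()
    {
        const int iterations = 100000000;

        var sw = Stopwatch.StartNew();
        for (int n = 0; n < iterations; n++) { short a = 1; }
        sw.Stop();
        Console.WriteLine("short: " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        for (int n = 0; n < iterations; n++) { Int16 b = 1; }
        sw.Stop();
        Console.WriteLine("Int16: " + sw.ElapsedMilliseconds + " ms");
    }
}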

But regardless of this, the other thing I mentioned, about strings, very much is a fact.

A good convention, for demonstrating a point with a numerical value, is to reference your supporting source. :slight_smile:

Ehm, I didn't get what you were saying. :stuck_out_tongue:

I believe Mike is saying cite your sources. Otherwise we can all just say "do this and it's a bajillion times faster" all we like, with no proof.

Oh, but I have code on Codeshare, and yes, I understand that I could have provided some source code showing what I was referring to, but I did provide a link to the conventions Microsoft recommends.

I always try my very best to code in adherence to the conventions given for the language in question.

An example could be:

JavaScript


var someInt = 10;

response.write(someInt);

C# (Same code snippet)


var someInt = 10;

response.Write(someInt);

Yes, it is almost exactly the same code, except for the casing.

In C#, all methods must start with UpperCase, whereas in JavaScript I must start with lowerCase. Languages are different, and so are the conventions used; therefore we as programmers should always try to keep those conventions and conform to standards, which I am ALL FOR.

Standards make things easier and uniform.

Code notwithstanding, I believe he was more specifically referencing your statement:
"I urge you guys to please adhere to those conventions as this will make it a 100 times easier for other developers to continue the work."

Either way, conventions are [em]good[/em] but shouldn't be seen as the be-all and end-all of coding. I personally don't think many hobbyists are going to get confused seeing Int16 instead of short, nor care too much if it costs them an extra 3 cycles in their code.

Remember, this is a learning community, not a corporation looking to squeeze every last cycle out of mission-critical software. :smiley:


var someInt = 10;

I think you should only use 'var' when you explicitly have the type on the right side of the assignment operator. This is a bad example of using the var keyword :slight_smile:
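To illustrate the point, a small sketch (the names are made up):

using System.Collections.Generic;

class VarExamples
{
    static void Main()
    {
        // Fine: the type is spelled out on the right-hand side.
        var customers = new List<string>();

        // Debatable: the inferred type (int) is not visible anywhere.
        var someInt = 10;

        // An explicit declaration leaves no room for doubt.
        int someOtherInt = 10;
    }
}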

I explicitly used the word "urge" to say you should use the conventions; however, it's still up to you to do so. The idea here was not to get into religion, just to drag the "missing conventions" into the light.

This was never about confusion; it has always been about trying to heighten everybody's coding abilities and quality, even my own.

var is used explicitly when the statement cannot be misinterpreted, like when assigning a number to an int. And in fact, in many other weakly typed languages you would always use var to declare a variable.

It assumes the reader knows that the default numeric type is int.

Please drink the Kool-Aid and repeat after me:

ā€œI pledge to always abide by the given code conventions to the best of my abilities.ā€ :slight_smile:

As Tom said, this is a learning community. Rather than starting a rant on the use of conventions, a better approach would be to post a series of good-coding-practice posts so everyone could learn from your experience. You gave some good examples.

Learning by example seems to be the favorite approach here.
