OOP and Procedural Decomposition

The purpose of defining my paradigms of coding conventions is for when I have to make a decision on how to implement something... I'd like to have a consistent way to deal with things, and to modify this document as needed (learning from mistakes/experience).

The purpose of releasing this document to other developers is in the hope that you will point out my fallacies... This document may sound biased, but I am willing to listen to anyone's opposing argument with an open mind. This whole process should be beneficial to both myself and the reader, forcing us to really stop and think about what the most effective choice is when it comes time for R&D.

OOP vs Procedural Decomposition-

A couple of years ago I was totally against OOP. A friend once described this as the chicken theory... someone gets comfortable with the way things are, then something new comes along and it's learning all over again. Now, however, I see the power behind it... the performance issues I was concerned about are really minimal if you carefully manage your objects. There are still some OOP conventions/views that I do not share, like using accessor functions to manipulate member variables. There should be no ambiguity (internally speaking); the whole foundation of C is the fact that it is a typed language. The opposing argument for using functions to manipulate private members is to avoid being responsible for sticking to types, because if you decide to change a type later you'll have to change everything that touches it. One concept a friend once mentioned was to avoid using member variables altogether to keep a function thread safe. The thing about this idea is that it assumes you'll have two threads calling the same function on the same instance. Generally that is rare, but when using DLLs and global functions it may be an issue... in which case passing parameters has the same effect as allocating an instance, or manually allocating memory to pass a struct. Other ideas which oppose keeping typed standards are overloaded functions, parametric polymorphism, virtual functions, and dynamic binding. These tend to push elegance as a higher priority than efficiency and robustness. I've finally found the power of using virtual functions, and if not abused they can be beneficial even in terms of performance, because jumping directly to the right function through the vtable saves time compared with walking a chain of switch conditions.
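
A minimal sketch of that last point, using a made-up Shape/Circle/Rect example of my own rather than anything from a real library: the switch version has to re-test the kind tag on every call, while the virtual call jumps straight through the vtable.

// Hypothetical example: dispatching on a type tag vs. a virtual call.
#include <cstdio>

// Procedural style: a type tag plus a switch at every call site.
enum ShapeKind { KIND_CIRCLE, KIND_RECT };

struct ShapeData {
    ShapeKind kind;
    double a, b;            // radius, or width/height
};

double AreaSwitch(const ShapeData &s)
{
    switch (s.kind) {       // one compare/branch chain per call
    case KIND_CIRCLE: return 3.14159265358979 * s.a * s.a;
    case KIND_RECT:   return s.a * s.b;
    }
    return 0.0;
}

// OOP style: the vtable jumps directly to the right function.
struct Shape {
    virtual ~Shape() {}
    virtual double Area() const = 0;
};

struct Circle : Shape {
    double r;
    explicit Circle(double r) : r(r) {}
    double Area() const override { return 3.14159265358979 * r * r; }
};

struct Rect : Shape {
    double w, h;
    Rect(double w, double h) : w(w), h(h) {}
    double Area() const override { return w * h; }
};

int main()
{
    ShapeData d = { KIND_CIRCLE, 2.0, 0.0 };
    Circle c(2.0);
    printf("switch: %f  virtual: %f\n", AreaSwitch(d), c.Area());
    return 0;
}

The point is not that one is always faster, only that the direct jump avoids repeating the same chain of comparisons at every call.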

Now I do support the ideas of Data Abstraction, Encapsulation, and Design. To touch on Design, I like the idea of defining what the Classes/Objects are. If you think about it, in Standard C the word was "Modular"... "This program has a modular design"... The fact that Classes are separated into header and source is a direct reflection of modular design. I also like the Composition/Inheritance idea, which helps ease the translation of a hierarchy design. However, I must insist that the foundation of each layer should ultimately be Assembly and Standard C, with very simple register parameter passing for each function. The tags, the tag list, the object (passing a structure to the function) should all be the foundation: very tight, straightforward, shared resident libraries. Using this lowest level gives the application programmer a design very similar to the OS programmer's. I believe the application programmer should have a generic API of his own if he wishes to be portable across operating systems, all of this done in Standard C. So the design should be a combination of top-down work, using a hierarchy that defines the objects and a bit of flowcharts/recursive descent, with a bottom-up building of generic tools and resources in Standard C using a minimal number of parameters. Finally, build each layer up as needed. The foundation can then become a new shared library; better yet, if the library is small, keep it static to avoid compatibility issues, as memory nowadays is abundant.
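
Here is a rough sketch of what I mean by the tag-list/struct-passing foundation; the TagItem/GetTagData/OpenWindow names are only an illustration loosely in the spirit of the Amiga tag lists, not a real API. The caller builds one array of tag/value pairs and hands the callee a single pointer, which fits in a register, and the callee supplies defaults for anything omitted.

// Hypothetical sketch of a tag-list foundation in C-compatible style.
#include <cstdio>
#include <cstdint>

typedef uintptr_t Tag;

struct TagItem {
    Tag       tag;
    uintptr_t data;
};

enum {
    TAG_END = 0,        // terminates the list
    WIN_Width,
    WIN_Height,
    WIN_Title
};

// The whole parameter set travels as one pointer to the tag array.
static uintptr_t GetTagData(Tag tag, uintptr_t def, const TagItem *tags)
{
    for (; tags && tags->tag != TAG_END; ++tags)
        if (tags->tag == tag)
            return tags->data;
    return def;         // default when the caller omitted the tag
}

static void OpenWindow(const TagItem *tags)
{
    uintptr_t w = GetTagData(WIN_Width, 320, tags);
    uintptr_t h = GetTagData(WIN_Height, 200, tags);
    const char *title = (const char *)GetTagData(WIN_Title, (uintptr_t)"Untitled", tags);
    printf("window %lux%lu \"%s\"\n", (unsigned long)w, (unsigned long)h, title);
}

int main()
{
    const TagItem tags[] = {
        { WIN_Width, 640 },
        { WIN_Title, (uintptr_t)"Demo" },
        { TAG_END,   0 }
    };
    OpenWindow(tags);   // one struct pointer carries all the parameters
    return 0;
}

A thin class or wrapper layer can then sit on top of a call like OpenWindow() without disturbing the Standard C foundation underneath.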

What would be nice is if the application were completely self-contained in a single directory! I don't see any reason why any file should have to leak out of that directory... The only exception should be things which can be shared with other applications, like fonts, libraries, the clipboard and so on, but even then those things should be able to be found in the current directory and take priority over copies in the shared directory, for the sake of version incompatibilities across applications which need different versions of the same shared resource.

One thing I must point out is that I think it's more professional to make a program ROM-able. Just think: if programs and the OS were ROM-able, and media existed where the user could disable any writes to these apps, that would be the sure way to fight viruses... Now realistically the environment prefs can't be ROM-able... but then there should be defaults for when they are missing (NULL), as well as a writable path to set them. The Amiga put the prefs in RAM (Env: and EnvArc:) and gave you the choice of using them temporarily or saving them permanently; what a cool concept. I believe the registry idea that the PC and Mac use is asking for trouble, because it is the perfect place to plant viruses. The environment prefs should be their own file... so that the user can cleanly delete them when it comes time to uninstall. Let's keep it simple...
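
As a sketch of that prefs idea (the paths, the Prefs layout, and the raw binary read below are all made up for illustration): look in the application's own directory first, then in a shared location, and fall back to built-in defaults when neither copy exists.

// Hypothetical sketch: local prefs take priority over shared prefs,
// and built-in defaults stand in when no prefs file is found.
#include <cstdio>

struct Prefs {
    int  width;
    int  height;
    char language[16];
};

static const Prefs kDefaultPrefs = { 640, 480, "en" };

static bool LoadPrefsFrom(const char *path, Prefs *out)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return false;                       // not found here, try the next location
    size_t got = fread(out, 1, sizeof(*out), f);  // raw binary read, kept simple
    fclose(f);
    return got == sizeof(*out);
}

static void LoadPrefs(Prefs *out)
{
    // The copy in the program's own directory wins over the shared copy.
    if (LoadPrefsFrom("./app.prefs", out))
        return;
    if (LoadPrefsFrom("/shared/prefs/app.prefs", out))  // hypothetical shared path
        return;
    *out = kDefaultPrefs;                   // nothing found: use the defaults
}

int main()
{
    Prefs p;
    LoadPrefs(&p);
    printf("%dx%d lang=%s\n", p.width, p.height, p.language);
    return 0;
}

Deleting the single prefs file (or the whole application directory) then leaves nothing behind, which is the clean uninstall I'm arguing for.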
