Monday, February 18, 2019

Generating machine code for entrance5.go


There are many aspects of computer science and its history that we are oblivious to. While it is a young discipline compared to others, it is still full of characters, stories, heroes, discoveries and inventions. One of these heroes is Grace Hopper. Personally, I didn’t know about her or her inventions. I am familiar with COBOL and had the understanding that it had been developed solely by IBM as a business opportunity for business clients. It’s nice to know the actual origins of something like a programming language and find a story with a human soul and intention behind it. I also think her broader goal of bringing ordinary people closer to computers is very admirable, worthy of a role model at the very least.

From my personal experience, most people don’t like to code or have no particular interest in programming or computer science, either because they think it is too difficult or because they don’t really care how computers actually work. Yet it is incredibly accessible to get started in these areas, and many people use them as tools for their jobs or even as a hobby. One just needs to pick up a book on CS or take an online course on virtually any programming language to, eventually, land one of the many jobs out there that require coding. This accessibility can be attributed to the Queen of Code. One could even say that the industry boomed thanks to her and to the invention of the language that most banks still use today. In fact, I would not be studying this degree right now had I not noticed how easy it seemed to make the computer do things just by writing words.

She definitely deserves to be a role model for women and for everyone, as an example of how much one person can accomplish and how many people she can affect during (and after) her lifetime.

Saturday, February 9, 2019

Creating a Parse Tree from 4thEntrance.lua


So, here we are, at the podcast about the compiler that gives this blog its name. I hope I do it justice. I thought the podcast would talk about how a compiler works in general, with added specifics about gcc and how it used certain languages or features to become as famous as it is today (and was almost 12 years ago). I was pleasantly surprised when it got a whole lot more technical and went through pretty much every single step of the way, including how it deals with special characteristics of languages like Java’s JIT compilation and C++ classes, even going as far as to explain (more or less) the structure of the GCC code itself. It was kind of hard to take it all in, but I think it’s worth it in the end.

I did not know much about the gcc compiler before listening to the podcast. I thought that it could only compile natively on Linux and could only compile languages directly derived from C, such as C++ and C#. When I was younger I used it on a virtual machine running Ubuntu to compile simple C programs. I didn’t have a lot of technical knowledge back then, so I could never get it to work properly on my native Windows system. I thought that it was just not possible, but now I know that I only needed a small C compiler to make it all work.

I really liked two things about the GNU Compiler Collection. The first is how modular it is: you swap the front end to support a new language, or the back end to target a new architecture. It sounds like a jack of all trades; anyone can take the base code and build a compiler for a new or specialized language, or for a new processor. The second is how it optimizes: first at a general, high level using trees, and then at a low level using RTL. I think this makes things easier to plan and modify from a design standpoint, because the different kinds of optimizations are clearly separated and can be added, modified or deleted at any point without worrying about unplanned changes or bugs.
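You can actually watch that two-level pipeline from the command line: GCC’s developer flags `-fdump-tree-gimple` and `-fdump-rtl-expand` write out the high-level tree form and the low-level RTL form of each function as text files next to the object file (the exact dump file names vary by GCC version). Here is a function simple enough to read in both dumps:

```c
/* pipeline.c — a function small enough to recognize in both IRs.
 *
 * Compile with:
 *   gcc -O2 -c -fdump-tree-gimple -fdump-rtl-expand pipeline.c
 *
 * This leaves textual dumps such as pipeline.c.*.gimple (the
 * language-independent tree representation) and pipeline.c.*.expand
 * (RTL, already close to the target machine's instructions). */
int sum_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; i++)
        total += i;
    return total;
}
```

Comparing the two dumps makes the separation concrete: the loop is still a loop in the tree dump, while the RTL dump is full of registers and jumps specific to the target.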

Monday, February 4, 2019

Syntax error: Please recompile 3rd entrance


Everyone thinks they can predict the future. I’m not saying this about programming languages specifically, but about anything: the economy, technology, even something as specific or as personal as our own lives. The funniest thing of all is that we can’t seem to get it right, ever. One may look at the past for hints of the future, examine the present and bet on future factors, or even imagine the future we want and build it ourselves. Paul Graham seems to think that an overall trend runs through technology in general, and that programming languages must follow it as well. I don’t disagree with him; after all, what do I know that he doesn’t? His argument seems almost anachronistic in nature; I think that is normally a good thing. However, I find some things in his discourse rather funny.

The most curious element is where he talks about parallelism and how it is (or will be) treated as a later implementation detail: that the most important thing is to first worry about abstractions, and then, if needed, parallelism can be added later on. This seems to contradict some principles I have learned at school. I normally make an abstraction first and then choose a paradigm or general tactic to tackle the problem. Multiple independent data sets with similar operations? Parallelism is the key. Modeling the real world in groups of elements that share common attributes and behaviors? Object-oriented programming should do the trick. Don’t care that much about efficiency and just need to pass thoughts from your head into code? Python has you covered. Think the Python guy is a fool for believing things are easy? Maybe C is more your style. Just want to blend into the crowd and hope no one notices you? Boy, Java would like to hear from you.

Anyhow, to each his own. I believe, in order to make excellent programs, we need to consider very carefully our abstraction process. Then, we choose a tactic by which to tackle the problem. Finally, we analyze extremely carefully how to implement said abstraction with that tactic.