Sunday, February 17, 2019

Mother of Compilers

Having just finished the article and a brief documentary about her life and work, I can only say I'm genuinely surprised by all the contributions Dr. Grace Brewster Murray Hopper made to computer science. I had no idea who she was or what she did in her long career.

Reading through the article, the only thing that didn't surprise me was that she was originally a mathematician rather than a programmer. This seems pretty common: if there is one thing the fathers and mothers of modern computing share, it is remarkable logical and mathematical ability. So yeah, I guessed from the beginning that she must have been a mathematician.

From there on, however, the article and the documentary were one surprise after another. I found the entry in her logbook about finding the first "actual" bug in a program quite funny, referring to a moth that got trapped in the computer's hardware. I can hardly imagine how she managed to create the first compiler on a machine that was basically built for arithmetic operations rather than driven by a logical language.

But what I admired most in her biography was her determination and her passion for doing what she loved, which she shared with everyone. Despite being considered "too old" to stay in the Navy at 40, she proved too valuable an asset to cut loose and was recruited back at 60, and on top of all this she kept teaching people about programming in her spare time. I envy all those fortunate enough to have attended one of her many lectures and learned directly from her. She was, without a doubt, a woman beyond her era, and an invaluable piece of computing and science history.

Monday, February 11, 2019

Internals of GCC

In this episode of the Software Engineering Radio podcast, with Morgan Deters as guest, the team discusses the characteristics and functions of the GNU Project compiler known as GCC.

At the beginning I found the recording a little tedious, but as it went on I grew more interested in the subject. I learned a lot of things I had never assumed about GCC, despite having used it for many projects and competitions in C. For instance, I thought it compiled only C programs, but I found out it also handles C++, Java, and several other languages.

On the other hand, it was very interesting to learn how the compiler works: it goes through three different phases in which source code is transformed into target code that the computer can understand and execute correctly.

The first phase is the front end, a translator that consumes the source and produces output for the middle end, which in turn uses that output as its input. The front end parses the source into an abstract syntax tree, a representation of what is in the source file, and passes that structure along.

Then comes the middle end, where the tree generated by the front end is used to produce an equivalent and more efficient low-level representation of the input.

Finally, we have the back end, which takes the output of the middle end as its input and produces code. The code is optimized by replacing certain blocks and is finally translated into assembly language.
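To make the three phases above concrete, here is a minimal toy sketch (nothing like GCC's actual code, and all names here are my own invention): Python's built-in parser stands in for the front end producing an AST, a small constant-folding pass plays the middle end, and a back end walks the tree emitting assembly-like instructions.

```python
# Toy three-phase "compiler" for arithmetic expressions, illustrating the
# front end / middle end / back end split described above. This is only an
# illustrative sketch, not how GCC is implemented.

import ast  # Python's own parser stands in for the front end


def middle_end(node):
    """Middle end: constant-fold subtrees that contain no variables."""
    if isinstance(node, ast.BinOp):
        left, right = middle_end(node.left), middle_end(node.right)
        if isinstance(left, ast.Constant) and isinstance(right, ast.Constant):
            ops = {ast.Add: lambda a, b: a + b, ast.Mult: lambda a, b: a * b}
            return ast.Constant(ops[type(node.op)](left.value, right.value))
        node.left, node.right = left, right
    return node


def back_end(node, out, counter):
    """Back end: emit assembly-like code; returns the result register."""
    if isinstance(node, ast.Constant):
        reg = f"r{counter[0]}"; counter[0] += 1
        out.append(f"MOV {reg}, {node.value}")
        return reg
    if isisinstance_name := isinstance(node, ast.Name):
        reg = f"r{counter[0]}"; counter[0] += 1
        out.append(f"LOAD {reg}, [{node.id}]")
        return reg
    a = back_end(node.left, out, counter)
    b = back_end(node.right, out, counter)
    mnemonic = {ast.Add: "ADD", ast.Mult: "MUL"}[type(node.op)]
    out.append(f"{mnemonic} {a}, {b}")
    return a


# Front end: source text -> abstract syntax tree
tree = ast.parse("2 * 3 + x", mode="eval").body
# Middle end: optimize the tree (2 * 3 folds to the constant 6)
tree = middle_end(tree)
# Back end: tree -> assembly-like target code
code = []
back_end(tree, code, [0])
print("\n".join(code))
# prints:
# MOV r0, 6
# LOAD r1, [x]
# ADD r0, r1
```

Running it shows the middle end's effect clearly: the subexpression `2 * 3` never reaches the back end, only the folded constant `6` does, which is exactly the kind of work an optimizing middle end performs on its low-level representation.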

Learning about this entire process was quite interesting and enlightening. I can't help but think about the complexity involved in making GCC compatible with so many languages and platforms.


Monday, February 4, 2019

The Hundred Year Language

Technology is always evolving. We continuously develop new software and hardware with the goal of making them ever faster and more efficient. In his PyCon 2003 keynote, Paul Graham talks about the future of programming languages.

I had never given any thought to the future of programming languages. It is true that computing power doubles roughly every 18 months, as Moore's law is popularly stated. However, I find it reasonable that this phenomenon cannot continue forever; there must be a natural limit to this growth, as Paul says. And as hardware gets faster, languages can become simpler to program in. As machines become more powerful, programmers have less need to compact everything and use complex algorithms just to squeeze efficient code within the limits of current memory and CPU power.

As I read through this essay, my main thought was whether optimization algorithms will play the decisive role in the future of programming, or whether it will be the languages themselves that, as the author suggests, must adapt and evolve, surviving and merging with other languages to create more complete and compatible ones able to exploit the faster computers of the future.

It is amazing to think that we may already have developed the "hundred year language": one adaptable enough to adopt elements from other languages and stay alive. Since other languages will very likely disappear (like Java), I agree with the principle that we should analyze the properties of our modern programming languages so that the best ones prevail.

One thing is for sure: I believe parallel programming will be key to the future of programming, and it will allow us to reach new levels of speed and efficiency a hundred years from now.
