Friday, November 11, 2011

Mind blown: Storytelling, Language, Technology, the Future

My mind is blown right now. I just returned from a colloquium session at Viterbi Engineering in which Bran Ferren, Co-Chairman and Chief Creative Officer of Applied Minds, spoke about some of the most shocking ideas I've ever heard. I only hope that I can articulate the sheer meaning and impact of what he said.
Let's start with a deceptively simple question:

What is the most important invention of mankind?

Some people said the wheel, fire (which is more of a discovery), the harnessing of electricity for light, tools... but according to Ferren, it is something else:

Language.

Before language, humans were herd animals. We hunted for food and lived with little organized communication with others. We could not advance our thinking, we could not grow out of our minds - we could not even begin to unleash what we were capable of.

But then language gradually took shape. It went from grunts and gestures to pictures and carvings. Soon we developed an alphabet... and over the centuries we began to unify languages on a larger and larger scale. Symbols and pictograms transformed - evolved - into what we have now.

Language became a brilliant means of storytelling, and storytelling is one of the most powerful qualities of humanity. But before we get into that, we should continue to examine history.

Every time we reinvented language in some way, a fundamental shift occurred in the way humans behave. Oral storytelling was an old means of perpetuating knowledge. What came next? Reading and writing. The invention of the printing press (which was not by Gutenberg, by the way; as Ferren says, printing was invented in China, while Gutenberg invented movable metal type - and the mere fact that Gutenberg was a white European male helped cement recorded history's mistake of crediting him with printing) drastically altered the way we communicate. Now, Gutenberg died penniless because the Church and other forces discouraged literacy, and the cost of a single book from a printing machine was the equivalent of $100,000 today. Moreover, nobody in his time could grasp the implications of what he had done. However, once we could print text ubiquitously, newspapers, magazines, novels, and so many other mediums of communication emerged, and humanity faced an enormous shift in daily life.

Then came the telephone. Did you know that when the telephone first came out, people, upon picking up the telephone, would not speak into it? After all, it was an alienating technology, and there's nothing in the human genome that programs our minds to speak into the phone after picking it up. In fact, this is why we originally had operators: not to regulate interaction when there's no one on the other end, but simply to get the conversation started! Thus the phone business encouraged people to get used to phones this way (otherwise you'd just have two people picking up their respective telephones and then not speaking into them...). People could not grasp the significance of this change in communication. The larger idea is that the phone, like the printing press, fundamentally altered our means of communication and behavior. It enabled humans to grow even more - to be able to envision communication without being physically present.

Both the printing press and the telephone captured storytelling in a new form. We've always been telling stories; it's hardwired into our minds. Our parents tell stories, and the most engaging ones remain with us for the rest of our lives. Cavemen and people from millennia ago told stories. And enabling new means of storytelling - or capturing language in a new way - has historically always led to a drastic change in the world. This applies to one of the most fundamental aspects of life in the early 21st century: The Internet.

The Internet, like the printing press and the phone, captured storytelling in a new form like no other. And Ferren believes that the Internet is the next most important invention of man after language. Further, he asserts some of the most astounding and mind-boggling things related to this and the whole idea of language... perhaps most notably the following few:

The computer revolution has not started yet.

Reading and writing are just abstractions that will eventually be nonexistent.

Whenever we rethought language, a fundamental shift in humanity occurred. 

During the times when we did reinvent language, nobody could grasp the significance of what was going on.


It's so difficult to even begin to grasp the implications of these ideas.
What does this all mean? What is going on? Ferren believes that to invent something that profoundly impacts humanity means (1) nobody in your time is going to grasp the significance of what you are doing and (2) a change in how we tell stories is critical. This is a rudimentary explanation of the larger picture, but I just found what he said beyond stimulating. There's more... (at this point, I'm just trying to get my jumbled thoughts down into words):

1) There are "big idea people" and "requirements process" people. They usually do not get along. However, they must get along for any big idea to become real.
2) We must take seriously our education - and not just for ourselves, but for future generations. The ability to pass on knowledge to future generations is essential to not only humanity's advancement, but also humanity's survival.
3) To make something real, you need...
    a. Vision
    b. Trust
    c. Simplicity

Regarding simplicity, he explained to all the engineers there that engineers tend to make things more complicated than they should be. He cited an example with space and writing. Someone thought that pens wouldn't work in outer space because pens depend on gravity for the ink to come down. So, companies poured millions into developing a pressurized pen that would work in space. They did it, all right. And the pen was (and is) pretty cool; it works. But Russia just used pencils. Seriously - keep things simple...
 
Everything I wrote boils down to the key ideas that storytelling is critical, that we should already see that the use of words - writing, reading - will eventually become outdated, and that future changes in the mediums for storytelling will be drastic and world-altering. We should expect some new technology to allow our minds to communicate by some other means... words can't begin to capture true expression.

Yeah. That's clearly the case in the mess of words I wrote above.

[Aside]
He mentioned one possible, simple way of reinventing writing as a medium of language. Colored words could indicate emotions... like red for anger. I thought of simple universal symbols along with writing to indicate emotions...
He also mentioned that we naturally envision things in 3D space. The idea that a picture is a thousand words is useful, as words take forever to describe things, but a 3D experience of something captures things even more...
Hmm... I seem to have left out the impact of Steve Jobs and the iPhone. Ferren mentioned this. The iPhone essentially reinvented a medium of storytelling by placing a screen on a phone. Not a big deal now that people are used to it, but imagine the shock when it first came out... and imagine the shock of the next new medium of storytelling...

Friday, November 4, 2011

Specifying miscellaneous attributes in a Rails text field

Quick code tip: Specifying miscellaneous attributes in a Rails text field is easy.

Here's an example in which you set autocomplete="off":

<%= text_field_tag 'textfield', nil, :autocomplete=>"off" %>

The idea is to use "nil" as a placeholder for the middle value. In this case, the middle value for text_field_tag is value, which is what normally fills the text field.

If you didn't specify nil as a placeholder for the function, you would get a text field that said {:autocomplete=>"off"} inside. Probably something to avoid :)
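For reference, the tag above renders HTML along these lines (attribute order may vary by Rails version):

<input autocomplete="off" id="textfield" name="textfield" type="text" />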

Saturday, September 24, 2011

Setting optional local variables in your partials

Rails' partials are great - but sometimes, it's a hassle maintaining them.

For instance, haven't you ever changed the name of a partial, and consequently had to go find all your renderings of that partial and tediously adjust the name?

Or, have you ever decided to add a local variable requirement to a partial, and then had to, again, find all your renderings of that partial and adjust the :locals => { :some_variable_I_now_have_to_set => "some value that shouldn't exist for most partials" } appropriately?

I don't have a solution to the first one (though I certainly wish I did), but I do - hooray! - for the second.

Basically, if you're thinking of doing something like this in your partial file...
<% unless optional_local_variable.nil? %>
  
<% end %>
...you'll encounter an error saying that "optional_local_variable" is not defined, which is quite disturbing, considering that you wanted to check if it was nil to begin with.

So what do you do instead?
<% unless local_assigns[:optional_local_variable].nil? %>
  
<% end %>

Voila. What does this mean you can do? Here's a more complete example:

books/index.html.erb:
Books
<%= render :partial=>"partials/book", :locals=>{ :books=>@books } %>

partials/_book.html.erb:
<% unless local_assigns[:books].nil? %>
  The 'books' variable was passed.
<% else %>
  The 'books' variable was not passed.
<% end %>

stores/index.html.erb:
Stores - Featured
<%= render :partial=>"partials/book" %>

The stores/index file's partial call doesn't require the local variable that was passed into the partial call in books/index. Useful.

Basically, if you want to make the specification of a local variable in the render :partial call optional, use local_assigns[:variable_name].nil? - note that this check also treats an explicitly passed :locals=>{ :variable_name=>nil } as nil.
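One subtlety: if you ever need to distinguish "passed as nil" from "not passed at all," local_assigns is just a hash, so a sketch like this should work:

<% if local_assigns.has_key?(:books) %>
  The 'books' variable was passed (possibly as nil).
<% else %>
  The 'books' variable was not passed at all.
<% end %>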

Thursday, September 22, 2011

When someone realizes your idea before you

What should you do when someone makes the thing that you've been dreaming of creating for years?

Smile & be genuine.
Try to make it better.
Ask yourself: Does it really make what you wanted? You probably wanted something different...
Just relax... go about your ways, discovering, embracing.

If it helps, treat life as a discovery process.

Sunday, September 18, 2011

The Stigma of Video Games

I don't play video games much. Honestly. I remember a time when I was addicted to a few games, but into high school and college, I stopped playing entirely simply because of countless other activities I deemed "more important."

But since I'm at the University of Southern California, which has (apparently) the best game design program in the country, I began pondering video games more. I started considering the stigma video games have and how gaming is similar to reading, a highly praised educational activity.

Someone who reads a lot gains credibility. He becomes a knowledgeable person. But someone who plays video games a lot doesn't gain this same credibility, although he no doubt learns things too.

Is this right? Should gaming have such a stigma?

Reading and playing video games share many similarities. You learn. You "get lost in a different world." You, potentially, become isolated from others. (In the case of reading, you might be reading in a group, and in the case of video games, you might be playing with others, but often, reading or playing video games would isolate you from surroundings.) Your mind is engaged and focused. They're similar...

I remember a conversation I had with a good friend about how video games should be viewed as credible an element to society as literature. So many novels have literary merit, and the educated members of society are proud to have read such novels. But games? What if games had "literary merit" too? They're so complex - perhaps even more so, what with the direct interaction with people. What if we could view the various elements of a video game - graphics, storyline, themes, meaning, representation of life, music, and more - and appraise the game just like we appraise literature?

If games were clearly educational, and if they didn't have the mild stigma they do, what would society be like?

#rambling thoughts...

EDIT ===
Here's an interesting link about video games being the next great art form: http://www.salon.com/books/feature/2010/06/20/tom_bissell_extra_lives_interview_ext2010

I got to thinking more about this. Perhaps the use of the word "game" in "video games" limits our understanding and our ability to see that video games are more than that. Ever read 1984 by George Orwell or Outliers: The Story of Success by Malcolm Gladwell? If you have, you'll know that limiting language/vocabulary influences thinking. By having the word "game" associated with video games, people have a harder time grasping the artistic and "literary" value of video games. This is probably because we associate "game" with so many childhood things - tic-tac-toe is a game, and so are board games, little puzzles...

Maybe we need a whole new term for video games.

Friday, September 9, 2011

Education, Adulthood

I just went to an outstanding lecture by Cathy N. Davidson, hosted by the University of Southern California's Visions and Voices program. Her lecture was about the science of attention, and how it transforms and should transform "the way we live, work and learn."

Although many great ideas emerged from this lecture, there were several that really hit me about education.

She asked the crowd about the history of the multiple choice test. Have you ever thought about that? This type of test has just become so classic that most people don't even question its existence.

Anyway, she asserted that the inventor of the multiple choice test actually made it ad hoc to deal with the many immigrants of the early 20th century. The key phrase in that sentence is "ad hoc." This test was only meant to handle the emergency of too many immigrants! The inventor had no idea - and especially no desire - for the test to be adopted by colleges. The inventor, in fact, was horrified that his invention became what it has, and he spent much of the remainder of his life denouncing his creation. Even when he held a prominent position at the head of a university, he advocated getting rid of this test. And unfortunately, he was fired because of this.

The problem with the test is obvious: it tests limited knowledge. It narrows the playing field and eliminates direct application to real life. Of course, it's convenient for testing little things - facts, figures, etc. But for higher education, it's a terrible limitation. These tests are only good for lower-order skills, and lower-order skills are insignificant as students progress through high school, college, and beyond. Ultimately, these tests fail for the journey to adulthood.

Adulthood. That was another part the speaker made an interesting note about. She explained how, in the last class of a course she once taught, she gave the students a simple task: Write down a few questions you want to ask me (i.e. her) on this last day we have together. Then, she said she was going to leave the room and let the class compare, all together, what they wrote, because in the end she was only allowing one question. The students had to discuss their ideas, negotiate, and agree on just one question.

When she returned, she saw that the students had smirks on their faces. It was evident they had a good question. And when they asked it, her breath was taken away:

"How do we become adults?"

Education prepares us so superficially for the real world. So many students know the books and know how to conquer the multiple choices and know how to snag the "A" in class. But what happens when you graduate? What happens when you finally face the real world, the world that our superficial educational system fails to prepare us for?

Now I ask my own question: How could these students come up with such a profound question to ask? The collaborative environment the teacher created by leaving the room and letting the students figure things out was powerful. Anyone can conquer a system; like I said, anyone can grab that "A" in class, or master those multiple-choice questions. But to really learn, we need the interaction that has brought mankind from hunter-gatherer to farmer to scholar to... today.

Collaboration - and the lesser role of the teacher - is a huge idea that education should implement. The traditional establishment of the teacher standing up in front of the class, presumably the master of the subject at hand, while the students sit in rows facing the teacher is a failed concept that should soon be wiped away. It precludes the collaboration that is so essential in the real world and lays the cornerstone for false beliefs that the teacher "knows it all."

Alright. I wrote this post in a rush to let my ideas flow from my mind. Still, I needed to write this because the lecture was really moving. I definitely left out a lot, but I wanted to mention what I have. Before ending this, though, I want to share a few statements that, at least to me, capture key ideas that have changed the way I think. These are phrases that came to mind while listening to this great speaker.

"The multiple choice test should fade away."
"Institutions are difficult to change."
"Remove the teacher and let the students explore."
"Take away the grading and let the real learning begin."
"How do we become adults?"

Friday, September 2, 2011

Google+'s Age Restriction

Google's expanding strategy with Google+ is incredibly interesting.

Setting the minimum age at 18 has encouraged the adults of society to latch on - major politicians, established professionals, and more - an area in which Facebook is lagging, what with the mild social stigma around an adult using Facebook (e.g. it feels awkward when you get a friend suggestion for your best friend's dad, and it's admittedly a bit odd seeing your high school teacher on Facebook).

Further, this strategy encourages anticipation from those below 18 to get a G+ as soon as they turn 18.

I personally find it even more intriguing since it's not only the age at which someone is officially an adult and can participate in contracts with Google; it's also, roughly, the age at which a high school student becomes a college student, which is a major transition socially in that student's life.


Wednesday, August 31, 2011

Great things are created "on the side"

Let's pretend that you hold a full-time job, or you're a college student absolutely loaded with work daily. This is normal. You're like millions of others in the world...

...except, you have these ideas you want to realize, or this personal project you really, really want to finish.

You constantly face this dilemma: work and work and work on non-personal project matters, or allot time to work on those ideas you have. The easy answer is to go with the typical work society asks of you - finish your college studies, try for grad school, aim for a good career, try to get that internship, please your boss, stay late after work - the list goes on and on. While none of these are necessarily "easy," they certainly are the standard way to do things.

But let's face it. If you're going to realize those ideas of yours that you're truly passionate about, or even those that you simply think are "cool" (which in fact could be better than the ones you're passionate about), you must allot time - even a little - to your project(s). This really makes all the difference.

Now, they say that if you really want to do something, you'll be able to allot time for it. But in this busy world, remembering this saying often just isn't enough to drive you.

Instead, take this advice: great things have been created "on the side."
Instead of even considering the idea, "Okay, self, I'm going to work on Project X every day from eleven to twelve midnight no matter what," it's far easier and more productive to simply go about it unthinkingly. Planning and setting explicit timeframes puts an unnecessary burden on you, one that you just won't keep up with if you're a full-time college student or you're working at a major company.

If you instead just work on your little project "on the side" while doing the typical things society asks of you, you'll find yourself creating something you truly value and find useful. History offers plenty of examples of great things created "on the side"...
(If you have other examples, comment and let me know!)

Bottom line: it's okay to work on a project "on the side" without designating strict deadlines and times to work. In fact, it might even be better.

Quick, last thing: working on a project "on the side" helps alleviate unneeded pressure you put on yourself if you really consider your ideas "big ideas." I personally am starting to see that it's easier to remove the weight you put on your ideas and instead go about creating them with a free and open spirit.

Read another interesting article about jumping on inspiration.

Saturday, August 6, 2011

PGError: ERROR: cached plan must not change result type... Solved

I got this error recently after deploying to Heroku:

ActiveRecord::StatementInvalid (PGError: ERROR:  cached plan must not change result type

It turns out the issue was just like that of this person: http://stackoverflow.com/questions/2783813/postgres-8-3-error-cached-plan-must-not-change-result-type.

Basically, I ran a migration / modified my database table, but didn't restart the heroku app. So, after running

heroku run rake db:migrate

I had to also run

heroku restart

and the problem was solved.

Why do we have to do this? According to this site on Heroku, "After running a migration you’ll want to restart your app with heroku restart to reload the schema and pickup any schema changes."

Thursday, August 4, 2011

Rails' link_to: specifying miscellaneous attributes

If you ever need to specify miscellaneous attributes, as I did when making my Rails 3.1 app mobile with jQuery Mobile, there's an easy way to do it.

Example: I want to create a link with the following...
<a href="" data-direction="reverse" data-rel="back" data-prefetch>xyz</a>

Just do this (in your embedded ruby html file):
<%= link_to "xyz", my_path, "data-direction"=>"reverse", "data-rel"=>"back", "data-prefetch"=>"" %>

The order of the variables after the path specification (my_path, in this case) generally does not matter, so you could put :class => "my_class" before, in the middle, or after those "data-" parameters too, and you could rearrange the "data-" parameters however you like.
Also, note the specification of "data-prefetch"=>"" for the case where there is no value to set the attribute equal to.
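For instance, mixing a class with the data attributes might look like this (my_class here is just a hypothetical example):

<%= link_to "xyz", my_path, :class => "my_class", "data-direction"=>"reverse", "data-rel"=>"back", "data-prefetch"=>"" %>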

This whole idea came in especially handy for me because jQuery Mobile uses such attributes frequently. However, the idea of specifying
"some_variable"=>"some_value"
in the link_to method can be applied to broader scenarios, too; it just might help you out.

Wednesday, August 3, 2011

Passing a path into a partial

I recently figured this out, and I thought others might find this helpful, too.

Ever want to pass a path into a partial - say, for a header, where there's always going to be a link in this one spot, but the actual path of that link changes? I encountered this problem when developing a mobile web app, where in the top left header section, I always wanted a "back" link, but the actual path of this link varied depending on the page.

Here's how I did it:
In the partial:
<% unless path.nil? %>
 <%= link_to "Dining Hall", send(path) %>
<% end %>
In the view that renders the partial:
<%= render :partial => "layouts/partials/header", :locals => { :path => "root_path" } %>

Instead of root_path, you could put, for example, books_path. Note, however, that passing "@book" or @book does not work, so this still has its limitations.

The whole trick is using the "send" method in the link_to. http://apidock.com/rails/ActiveRecord/Associations/AssociationProxy/send
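Incidentally, you can combine this with the local_assigns trick from the September 24 post above to make the path optional in the partial (a sketch):

<% unless local_assigns[:path].nil? %>
 <%= link_to "Dining Hall", send(path) %>
<% end %>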

I certainly found this trick to be useful, and perhaps you will too.

Saturday, July 30, 2011

Reducing Slug Size (Heroku)

If you check out http://devcenter.heroku.com/articles/slug-size, three tips are offered to reduce slug size:
  1. Move large assets like PDFs or audio files to asset storage
  2. Ignore files which are unnecessary to run the app. For example, unit tests, PSDs, or large design documents.
  3. When possible, reference a released gem by name in your Gemfile rather than loading it from source using the :git option.
Let's first look at #3.

I originally had the following in my Gemfile:
gem 'rails', :git => 'git://github.com/rails/rails.git', :branch => '3-1-stable'
gem 'sprockets', :git => 'git://github.com/sstephenson/sprockets.git'

When I changed this to
gem 'rails', '3.1.0.rc4' # :git => 'git://github.com/rails/rails.git' ...
gem 'sprockets', '2.0.0.beta.10' # :git => 'git://github.com/sstephenson/sprockets.git'
my slug size changed from 63 MB to 61.6 MB. But as Heroku recommends,
The maximum slug size is 100MB. Most apps should be far below this size. Anything under 10MB is good. If you exceed 40MB, you should think about trying to reduce the size of the slug.
...so I better keep going.

I moved on to #2. I created a .slugignore file with the following contents:
features
test
spec

In my case, just ignoring these folders in my app reduced my slug size from 61.6 MB to... 61.6 MB. No difference at all, unfortunately. I suspect the .slugignore isn't the real issue, because when I set up this same app again at a different URL on Heroku, the slug size was only 20.5 MB - so something else must be inflating the slug.

After filing a request to Heroku (https://support.heroku.com/requests/27593), I learned that, as the support staff member said, "There is currently a bug in bundler that is not cleaning old versions or unused gems out properly. We are working on upstreaming a fix for this, so hopefully sometime soon your slug size will lessen automatically with a push."

Guess it's back to waiting then. Nonetheless, I hope you found my little investigation above interesting if not helpful.

Tuesday, July 26, 2011

Trick to tell when a website was last updated

Some of you might find this useful... I certainly do when I want to check how up-to-date a website is.

Put this in the URL bar at the top, and hit enter:
javascript:alert(document.lastModified)
This will tell you the date and time (Greenwich Mean Time) the page was last modified. (For dynamically generated pages, this often just returns the current time, so the trick is most reliable on static pages.)

This is nice when you're citing online sources and you need the date the site content was last changed, or if you simply want to know if a site's content is likely to be relevant today.

Comment if you find this helpful :]

Thursday, July 21, 2011

Using PostgreSQL as your database for Ruby on Rails

The following blogpost does an outstanding job of explaining how to set up PostgreSQL for Ruby on Rails development on OS X (for Rails 3).

https://willj.net/2011/05/31/setting-up-postgresql-for-ruby-on-rails-development-on-os-x/

However, there is one part I want to draw attention to:
$ createuser shawsome
Shall the new role be a superuser? (y/n) n
Shall the new role be allowed to create databases? (y/n) n
Shall the new role be allowed to create more new roles? (y/n) n

He says that it's okay to specify "No" for each of these, but you'll face a testing problem later:
$ bundle exec rake db:test:prepare
PGError: ERROR: permission denied to create database

Instead of replying "n" to "Shall the new role be allowed to create databases?", you should reply "y" (yes): Rails' testing frameworks inherently require that the test db be destroyed and regenerated frequently, as the existence of the rake command "rake db:test:prepare" shows.

So what can you do now to fix the role? Instead of retracing your steps and making a new user with proper permissions, just change it via postgres:

1) Connect to the standard "postgres" db of your postgres installation (replace "your_username" with your username):
psql -U your_username postgres

If you're not sure what your_username is, run the following; it's the name under "Owner" across from "postgres":
psql -l

Note that not specifying your username might let you connect, too:
psql postgres

2) Run the postgres command:
ALTER ROLE that_username CREATEDB;
...where that_username is the username of your Rails application.
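To verify the change, you can list all roles and their attributes with the psql meta-command:
\du
The role should now show "Create DB" among its attributes.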

That should do it. If interested, read on about useful postgres commands.

Wednesday, July 20, 2011

A social network convo: Google+, Facebook, Twitter

I had this post on Google+ a while back that led to some interesting discussion on social networks.
The original, on Google+, is right here if you want to check it out.

The key parts went like this:

Me: The more I think about it, the more it seems like Google+ will be comfortably ensconced between Facebook and Twitter. Facebook is heavier; you're less likely to babble about random things on there - each post you tend to put more thought into. It also has that exclusive and private feel to it - just me and my friends. This is relative to Twitter, where you tweet whatever the !$@# you want; it's like a shouting affair. Plus the fact you can follow nearly anyone...

G+ puts all this together with the ability to follow and group into circles. The lack of a wall makes you a little more comfortable to post whatever, whenever, without considering the "other people's posts on my wall vs. my posts" ratio.

Ah and I think Instant Upload is a big deal... and the (so I hear) greater privacy controls over photos. The easier privacy controls actually is a huge.... plus.


Friend (male): Interesting - I would actually rearrange that to say that Facebook sits between Plus and Twitter, in that status posts and such are seen by all "friends," regardless of if they're a "friend of 10 years and counting" or a "friend that I saw once in class." For me that exclusive and private feel to it is more readily manifested by Plus's control via circles, than Facebook's control via friending. I.e., you can mimic Facebook in Plus by putting everyone into the same circle, but it's a lot harder to mimic Plus in Facebook, and going down a step you can mimic Twitter in both by again putting everyone in the same circle in Plus and barfing your brains out in discrete little chunks of 140 characters, or doing the same while accepting any and all friend invites in Facebook, but you can't go back up the ladder from Twitter to either of the two.

Friend (female): In Facebook, you can filter your friends into groups so certain people see certain things, while others don't...which is kind of similar to the circles idea, IMO.

What I would like to see in + is the trending stuff in twitter.....because twitter is cool like that.

Friend (male): Granted, but because Plus is entirely built around the concept, the interface is a lot smoother and more intuitive - a merely aesthetic point, perhaps, but it makes a difference. The concept is made salient in Plus, while I would say that very few people in Facebook go to the trouble of compartmentalizing to such a degree. At the moment I see one of Facebook's largest advantages as its enormous user base (world's third largest "nation" :P) - this difference may be what will allow Plus to siphon people away.

Also, more generally, I see Google's expertise in search being very easily implemented, while Facebook's search functions really aren't much to speak of.

Me: +[insert name] though Facebook's list organization is far more difficult to use than Google's Circles;
+[insert name] I see what you mean. I feel that the ability to draw from both types of social networks gives me the image of something like this:

Facebook <===> G+ <===> Twitter
More formal; flexible formality; informal
Many people, loosely categorized; categorized people through and through; not necessarily ppl
Meaningful posts; any kind of post; %$#%^ posts (tweets)

... eh, it's all relative anyway. It probably ties more to how I personally use social networks.

Friend (male): Yeah, I think Facebook isn't nearly as "formal" to a lot of other people as it is to you - though one interesting perspective is to consider the GUI. Question: between Windows XP and Mac OS X, which seems more "formal" or "professional" or "technical"? For me, based purely on presentation, I would say XP, if only for the reason that XP looks more linear and squarish, while Mac OS X is softer and more elegant. Similarly between Facebook and Plus, do you think there could be an influence of presentation? Facebook is familiar, structured, and linear, while Plus is open and sparse, and graphically less rigid.

Me: YES. I was actually about to mention design. Facebook's design has that blue bar at the top, which lends to a "rigid" or "restricted" feel (at least for me, probably for a good number of others deep down). Also, while not terrible, the constant ads on the right sidebar, or actually both sidebars combined contribute to the barred feeling of the site. After visiting Facebook for years, I admit the comfortable feeling of the site wears away with these subtleties... albeit only for brief moments when I'm surfing.

Google+ has such a clean, smooth, minimalist layout that the feeling of flexibility really comes through. It can morph into anything = how I think of G+. In fact, google.com has always been my home page, and part of the reason is the absolute emptiness of the page. It just, for me at least, clears your mind. Helps you relax a bit, even. (You know, with all the clutter on the web and in life.)
Between Windows and Mac, I feel Windows is slightly more formal - partly based on its philosophy behind maximizing windows, partly based on my experiences visiting software companies showcasing their software - they're all on Windows. Windows also gives me a "laborious" feeling that's difficult to describe. Oh - and the key thing that undermines the formality of Mac (which isn't a bad thing) is the philosophy behind having many windows open at once with corners and bits of sides peeking out. (Goes back to the philosophies behind maximizing: http://www.forevergeek.com/2006/09/mac_vs_windows_its_all_about_the_maximize_button/)

Technical, though, I wouldn't push too much to the Windows side - the Unix OS within Mac is actually a large reason I switched. I needed it for RoR programming, and I know all [good] Rails devs say Mac rules and Windows.. doesn't.


Friend (male): Personally I don't really mind the ads on Facebook - they're there, and they take up space, sure, but not an inordinate amount, so they become just another part of the framework that I don't need to pay attention to.

And yeah, I've always had google as my homepage as well, until I switched to Chrome haha (now my "homepage" is just that "new tab" page).

Haha, I won't get into Windows vs. Mac - that's a debate that never ends. And really - in the context of programming, given that you can use even notepad to program, albeit more slowly, it really doesn't matter what OS you're on. In other contexts such as gaming or graphic design, there are much clearer advantages. I intended it as purely a visual comparison - though your article on "maximization" as an indication of the workflow philosophies is interesting. I'm not sure how much I believe about how strong of an indicator it is, but it's an interesting perspective.

Tuesday, July 19, 2011

That's not what I said!

Ever get in one of those arguments in which you (or the other person) is like, "That's not what I said!" or "I never said that!" in a tone of rising fury?

The retaliation tends to be, "Yes you did! You definitely said [some word here]!"

From experience and... well.... musing, I believe that in 99% of these arguments, the listener is right. Why?

Think about when you are speaking to someone. All you want to do is convey some meaning to this other person, so you're focused primarily on the meaning. More often than not, the actual words you say aren't registered in your own mind nearly as much as they are by the listener, who must absorb the words first and then extract meaning. I see it as something like...

Speaker ==> Meaning he wants to convey ==> words he tries to use to convey it ==> Listener

The listener is closer / more focused on the words because he is trying to grasp them entirely so he can get the speaker's meaning. Think about it. The speaker is often so entrenched in the meaning that he may "let things slip" from his mind quite easily. This all goes to explain why I feel the listener is correct in most of these arguments.

Quick example from my life. My five-year-old sister is talking about trophies - she has a little trophy from kids' gymnastics - and in a quick talk she says that she wants her own place to put all her trophies (cute little ambitious kid, right?). From the depths of my heart, I dislike the extrinsic and superficial values of trophies, so I mutter that trophies are stupid. Yep. Slipped right out of my mouth, and I didn't even realize it. My sister says, "Noooo..." and grabs her little trophy and says something about how pretty it is, and I, then more engaged, say that trophies are "superficial" and don't really matter.

Then it happens. My sister says they're not stupid; I say I didn't say they were stupid; she says yes I did; I say no I didn't; she insists; I insist... the petty argument ends with my simply turning away and going back to work, telling her to be off with some activity. I reflect on the event later, though, and ask myself: who was right?

The five-year-old.  That is, the listener.

Monday, July 18, 2011

From Windows to Linux to Mac

For over a decade I've used Windows. Then, sometime during junior and senior year of high school, I started using Linux - specifically, the Ubuntu 10.x distribution. Finally, the summer before I entered college, I made the long journey to Mac.

Why did I switch? What are my experiences and thoughts with each platform?

Initially, I needed a Unix-based machine for programming work - i.e. Linux or Mac. I thought going to Mac would be too large a jump, so I installed an Ubuntu distribution of Linux on an old Windows laptop. Experimenting with that and doing work on it for a good year was nice, but the laptop was just sooo old, and I eventually had the opportunity to get an entirely new laptop. After looking into Mac - particularly learning the keyboard shortcuts and philosophies behind it - I decided it was the right choice. I didn't want an unstable, "risky" machine like Linux, and I had since realized how inelegant Windows was. I got a Macbook, and I've never looked back since.

That's the quick synopsis. There are lots of minuscule details that played an enormous part in each system. Since Linux never really took a big hold, particularly because I felt there was just too much freedom with what you could do to it (so using it could be dangerous), I'll distinguish Windows and Mac.

1) Maximizing:

I used Windows ever since I was little - so that's for ~10 years. Its philosophy behind maximizing is simple: click the maximize button in the top right, and the window's width and height stretch until they reach the edges of the screen.

Mac's philosophy is different. On computer screens, it actually looks ugly to maximize a window all the way as Windows does because more often than not, excessive white space shows, and the window consequently looks unappealing. So Mac toggles between two sizes: (1) the "best" size of the window to display as much of its content as possible, and (2) the smaller size so you may place other windows side-by-side. This actually makes more sense, and it makes the screen far more elegant to look at. For a more thorough explanation of this, check out http://www.forevergeek.com/2006/09/mac_vs_windows_its_all_about_the_maximize_button/.

2) Minimizing

Windows basically lets you store open windows at the bottom bar through minimizing. Alt + Tab lets you toggle between all active windows, regardless of whether the window is minimized or not.

Mac, by contrast, treats minimized windows as an isolated group. Once minimized, windows cannot be switched to via Command + Tab (the Mac equivalent of Alt + Tab). Minimized windows on Mac are "stored away" - often so you can reopen them in different spaces or just deal with them later. Personally, I like to reactivate minimized windows by doing the four-finger swipe down on the trackpad (or through whatever means to activate exposé) and then selecting the minimized window (it'll always be at the bottom, in the minimized windows section).

That said, Mac also provides the hide option, which hides all windows associated with the active program. So if you have various Google Chrome browsers open, Command + H will hide all of them. They don't appear in the minimized windows section, and activating exposé won't show them; they're hidden. To reactivate them, I just Command + Tab to the program. This hiding feature is nice when you just want to hide from your mind all windows related to an application without exiting the program.

3) Usage of clicking

Perhaps one of the biggest difficulties with switching from Windows to Mac was the difference in using clicking. In this post, I describe how I disliked the extent to which so many Mac users tediously used the mouse to navigate all the way to whatever they wanted to click when they could just use a shortcut to get there a million times faster.

First, I realize now it's not that bad to use the mouse like so; in fact, using the mouse for things actually helps users' stamina when using a computer. That might sound weird, but think about the times your hands get cramped from using Windows doing all this intense work for hours... if the mouse had been used, something comfortable albeit a bit slower, you'd be more relaxed. It really helps keep users in a calmer, more comfortable state while working on a computer. Second, if you check out the rest of that post, there are actually many, many shortcuts that you can set on Mac. It's just a matter of having the knowledge to use them.

4) Design

Undoubtedly, I find the Mac design more comfortable. In spite of nearly a decade's worth of using Windows, after barely half a season, I can feel the beauty of Mac each moment I use it. It helps me feel inspired and comfortable in doing my work. So there's a reason - a worthwhile reason - Mac is more expensive; it truly is far more elegant and user-friendly.

Specifically, let's break down the situation in which an application is active. On Windows, an active application is simply the window it's in. You could have many Chrome windows open right at this moment. On Mac, when an application is active, you see its name in the menu bar at the top next to the Apple logo. On Mac, an application can be active and still have zero windows. This definitely changes things around a bit. On Windows, when you close the one Chrome window you have open, you are effectively exiting the Chrome application. Not so on Mac. The menu bar still shows Chrome. Command + Q is necessary to exit the program. I, however, find it nice to leave an application active although no windows for it are open, so you can quickly open a window when it's suddenly needed.

5) Resizing

One complaint some Windows - and sometimes Mac - users have when working on a Mac is the fact that you can only resize a window by dragging its bottom right corner. I've thought about this, and indeed, some people definitely dislike it (http://forums.macrumors.com/showthread.php?t=203227). But let's pause for a second. Look closely at any window on a Mac, and you'll realize that it casts a shadow on the background, which is simple yet ingenious. Further, look at the edges. There's no thin border like there is on Windows. In sum, Mac sacrifices a little functionality for further elegance. Some might think it's not a worthy tradeoff; others, the contrary.

Alright. Pause. I guess you could still have the ability to drag windows from any side even with the kind of window design Macs have, but there's a silent benefit to lacking such functionality. I've noticed that on Mac, I spend time resizing / adjusting window sizes far less - partly because of the maximizing philosophy of Mac (see 1. Maximizing), and partly because of this only-the-bottom-right-corner thing. I think this is good - seriously. Users aren't as distracted or fidgety about adjusting window sizes, and productivity rises. I know, it may seem insignificant, but I really feel far more comfortable. I rarely ever feel the urge to resize a window now, and when I do, I pause and ask myself if I should bother going all the way to the bottom right corner and dragging or if things are elegant as is. Usually, it's the latter.

6) Folders, Applications

Apple likes to have a library of files all bunched together, put in one opaque but convenient place. This is true for iTunes (with its iTunes library) and iPhoto (with its Events). Windows users, including me previously, are probably more accustomed to complete control over files and their locations in folders. While you can do the same on Mac, albeit with slightly fewer common keyboard shortcuts (does anyone know how to cut and paste in Finder with keyboard shortcuts? Seriously... do we really have to do stuff like this? http://lifehacker.com/5622046/cut-and-paste-files-in-os-xs-finder-with-automator-services), this other concept exists, and it took me a while to get used to: To add a file to iTunes, it must be in the iTunes library. Similarly, to add a file to iPhoto, it must be in Events.

Basically, the iTunes library and Events for iPhoto are like "general storage areas" that have to store everything before you can organize it or do whatever you want. Wrapping my head around this was odd as I came from a decade spent on Windows, but once I did, the whole deal with file management was much easier. I also find everything much more elegant.

So.

These are just a few highlights of the differences mixed with my thoughts on usage. I hope you find some of them beneficial.

If you have notions regarding Mac and Windows (or Linux), feel free to Leave a Comment Below :).

Saturday, July 16, 2011

Trick to tell the number of links, images, or forms on a webpage

I have no idea WHY you might want to do this, but you can. :]

For each of these, put the following in the URL bar at the top, and hit enter:

To find the number of links...
javascript:alert(document.links.length)

To find the number of images...
javascript:alert(document.images.length)

To find the number of forms...
javascript:alert(document.forms.length)

Friday, July 15, 2011

link_to with image_tag

How do you use Rails' link_to helper with the image_tag helper in your views?

Easy:
<%= link_to (image_tag "my_logo.png", :size => "150x50"), '/' %>

Just surround the image_tag part with parentheses and place it where you'd typically put the text to display in your link_to.
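Alternatively, link_to also accepts a block, which can read more cleanly when the linked content grows (this should be equivalent to the line above):

<%= link_to '/' do %>
  <%= image_tag "my_logo.png", :size => "150x50" %>
<% end %>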

Mac shortcuts

After making my switch from Windows to Linux to Mac, I realized how helpful it could be to others to highlight a few key Mac features that I personally find essential. It always bothered me how most other Mac users I see use the mouse just too much. They move it all the way to the menu bar, click, and then move back down the list to select something... when they could just use a keyboard shortcut to get it done instantly. I guess it's a productivity thing, and unproductivity has always been one of my pet peeves.

On Windows, many users like to use the keyboard to tab from button to button and then hit enter to select whatever they want. Mac users seem to enjoy moving the mouse all the way to the button and clicking, and I have to admit this was one of my biggest turnoffs with using a Mac. But I knew there just must be a shortcut, and when I found it, boy was I glad:

*If a button or icon in a form or pop-up menu is highlighted with a blue border, hitting the spacebar is equivalent to clicking that button.

*If a button is entirely blue, hitting enter is equivalent to clicking that button.

*If you want a shortcut to "cancel" in a pop-up dialog, hitting the escape key will do the trick.

*If you want to tab between all controls, open System Preferences -> Keyboard -> Keyboard Shortcuts and look for the words "Full Keyboard Access" located at the bottom of the menu.  Normally, "Text boxes and lists only" is selected, but selecting "All controls" lets you tab among all controls.

These were just a few key things that helped me use my Mac more productively and eased my transition from Windows and Linux.

More info: http://support.apple.com/kb/ht1343.

Monday, July 11, 2011

Sass partials with Rails 3.1 - use .css.scss

I tried importing partials in Sass for a Rails 3.1.rc4 app today, but for some reason, variables just wouldn't be recognized although they definitely existed in the partial.

The key thing is that your partials have to have the .css.scss extension - not the .scss extension as is shown on the Sass website (http://sass-lang.com/tutorial.html) explaining partial usage.

Here's an example of Sass used in my app:
assets/stylesheets/admin/meals.css.scss:
// Place all the styles related to the admin/meals controller here.
// They will automatically be included in application.css.
// You can use Sass (SCSS) here: http://sass-lang.com/
 
@import "../partials/standard_list";
 
body {
 #meals_index {
  #meal_new_area {
 
  }
  ul#meal_list {
   @include standard_list(meal);
  }
 
 }
}

assets/stylesheets/partials/_standard_list.css.scss:
@mixin standard_list($resource) {
 $light_green: #52D017;
 
 list-style-type: none;

 li {
  ul li {
   display: inline-block;
  }

  ul li.name {
   position: absolute;
  }
  ul li.controls {
   position: relative;
   margin-left: 300px;
   width: 200px;
  }

  ul##{$resource}_list_headers {
   li { 
    font-weight: bold;
   }
  }
  ul.items_within_#{$resource} {
   li.name {}
   li.controls {}
  }

  ul:hover { background: $light_green; }
 }
 
}
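In case the interpolation is unclear: with $resource set to meal, the #{$resource} selectors above compile (ignoring the outer nesting) to something like:

ul#meal_list_headers li { font-weight: bold; }
ul.items_within_meal li.name { }
ul.items_within_meal li.controls { }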

Sunday, July 3, 2011

Running multiple tags for Cucumber

If you want to run multiple tags at once, run something like this:
cucumber -t @a,@b
Note that there should be no space after the comma.  The sole separator is the comma.
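Also note that the comma means OR (scenarios tagged @a or @b). To require both tags (AND), repeat the flag instead:
# scenarios tagged @a OR @b
cucumber -t @a,@b
# scenarios tagged both @a AND @b
cucumber -t @a -t @b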

See this site for more info.

Saturday, July 2, 2011

"Missing template" error with RSpec... Solved

If you ever get an error like...
Failure/Error: post :destroy, :id => organization.id
     ActionView::MissingTemplate:
       Missing template superadmin/organizations/destroy, application/destroy with {:handlers=>[:erb, :builder], :formats=>[:html], :locale=>[:en, :en]}. Searched in:
         * "#"
...when you run a spec test for a controller, and your routes are definitely fine and there shouldn't be a template for the action, you just need to specify a redirect_to or render in the action.
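For example, the destroy action from the error above might finish with a redirect like this (a sketch; the path helper depends on how your superadmin routes are defined):

def destroy
  @organization = Organization.find(params[:id])
  @organization.destroy
  # Without an explicit redirect or render, Rails looks for a destroy template
  # and raises the MissingTemplate error above.
  redirect_to superadmin_organizations_path
end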

Thursday, June 30, 2011

undefined method `hash_for_... Solved.

Ever get a nasty error like this?
And I am on the controls page                        # features/step_definitions/web_steps.rb:44
      undefined method `hash_for_new_admin_organization_path' for # (ActionView::Template::Error)
      /Users/daze/.rvm/gems/ruby-1.9.2-p180@rails31/gems/actionpack-3.1.0.rc4/lib/action_dispatch/routing/polymorphic_routes.rb:133:in `polymorphic_url'
      /Users/daze/.rvm/gems/ruby-1.9.2-p180@rails31/gems/actionpack-3.1.0.rc4/lib/action_dispatch/routing/polymorphic_routes.rb:140:in `polymorphic_path'
      /Users/daze/.rvm/gems/ruby-1.9.2-p180@rails31/gems/actionpack-3.1.0.rc4/lib/action_view/helpers/url_helper.rb:111:in `url_for'
      /Users/daze/.rvm/gems/ruby-1.9.2-p180@rails31/gems/actionpack-3.1.0.rc4/lib/action_view/helpers/url_helper.rb:242:in `link_to'
      ./app/views/admin/organizations/index.html.erb:4:in `_app_views_admin_organizations_index_html_erb___2726815532635892939_2153996520'
No worries.  You probably just forgot something in your routes.rb file.
In my case, I had
<%= link_to "New Organization", new_admin_organization_path %>
in my view.  Sure enough, I hadn't specified the new admin organization path in my routes file.

Solution: put the following in my routes file:
namespace :admin do
  resources :organizations
end
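With that namespace in place, rake routes should now list the helper, something like:
new_admin_organization GET /admin/organizations/new(.:format) {:action=>"new", :controller=>"admin/organizations"}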
If you find this helpful, feel free to comment below :].

Wednesday, June 29, 2011

Vacillating - just choose one

vacillate - verb (used without object): to waver in mind or opinion; be indecisive or irresolute

A few personal traits I want to improve on:
-patience
-making up my mind quickly

About the second one - it's important to have the mentality of just choose one! for the sake of productivity. Really... think about the cost of vacillating, of being uncertain. It eats up time better spent on something else, and it's especially bad if the issue is insignificant.

Here's one example (beware: 99% of you won't know what I'm talking about): choose Webrat or Capybara. They're both testing frameworks for Ruby on Rails, and I ended up choosing Capybara because (1) it has ongoing support and (2) whoever I asked seemed to be using it. There, done and done. Even if I had gone the other way, it's no big deal.

Okay, if that example flew over your head, then think about day-to-day things. Here's one: Which homework do I do first? If you're a high school student staring blankly at assignments crudely jotted down in your planner, wondering where to start, just choose one! In the end, you're going to do them all anyway (right?), so it's inconsequential where you start. The impact is just... nil.

So you probably get the idea that I appreciate maximum efficiency. Very true.
I really strive to make my time as productive as possible - which, actually, might tie to my patience threshold...

Tuesday, June 28, 2011

Authlogic streaming issue in Rails 3.1

At the moment, there's a streaming issue with Authlogic and Rails 3.1:
https://github.com/binarylogic/authlogic/issues/262#issuecomment-1460466

Apparently, Authlogic violates MVC in some entrenched way, and consequently this gem breaks Rails 3.1's new streaming.  Hopefully there will be a solution soon, but for now, if you get a

"Cannot modify cookies because it was closed. This means it was already streamed back to the client or converted to HTTP headers. (ActionDispatch::ClosedError)"

(as I did when I ran my cucumber tests together), you'll have to make do and, e.g., run your cucumber tests independently by using tags (example: cucumber -t @authlogic-one).

(.+) and ([^"]*) - Regexps

The Cucumber testing framework of Rails generates "([^"]*)" if you have an undefined step like
Given a user named "John"
But you can also use "(.+)" or, if you don't want the quotes, simply (.+). For most simple steps, (.+) and ([^"]*) behave the same, but they aren't strictly equivalent: (.+) requires at least one character and will happily match double quotes, while ([^"]*) matches zero or more characters and stops at a double quote.
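As a quick illustration, either pattern captures the quoted name from the step above (the User model and attribute here are hypothetical):

Given /^a user named "([^"]*)"$/ do |name|
  @user = User.create!(:name => name) # hypothetical model and attribute
end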

Misc. Tags: regexp, regular expression

Monday, June 27, 2011

Using MySQL as your development database

On my Mac OS X, I'm starting a new project with Rails 3.1, and I'm using MySQL as my development database because when I deploy, MySQL will be the production database, and the two dbs really should match.

Unfortunately, after running
rails new mynewapp -d=mysql
and testing out
rails server
I got this error:
error: 'Can't connect to local MySQL server through socket '/tmp/mysql.sock' (2)'
Of course, I hadn't (1) started the MySQL server yet or (2) created my development database. This was new to me, since I normally just developed with the sqlite db Rails automatically ships with for development.

After using homebrew to install mysql ("brew install mysql", as Ryan Bigg suggests at http://ryanbigg.com/2011/06/mac-os-x-ruby-rvm-rails-and-you/), I followed the additional installation instructions that come with the brew install (viewable through "brew info mysql" - pretty darn important for finishing the mysql install with homebrew), and then ran these commands:

# start the server - something I'm not used to after using sqlite
mysql.server start
# create the db; specify -p if necessary
mysqladmin -u root create mynewapp_development

Then, "rails server" was successful.

(Note: running "mysql.server start" won't be necessary if you follow the instructions from "brew info mysql" and configure the server to start on startup.)
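For reference, the development section of the config/database.yml that "rails new mynewapp -d=mysql" generates looks roughly like this (details vary by Rails version):

development:
  adapter: mysql2
  encoding: utf8
  database: mynewapp_development
  username: root
  password:
  socket: /tmp/mysql.sock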

Friday, June 17, 2011

Speed test: rake vs. rspec spec, rake cucumber vs cucumber

I timed different rspec commands to see which ran faster.

These tests were done on a really old machine so that the times would be drawn out; newer machines execute far more quickly, making the differences less noticeable.

(Precondition: all tests run with regard to the same code.)

Test 1: rake spec:controllers vs. rspec spec/controllers
Results:
rake spec:controllers --> 23 seconds
rspec spec/controllers --> 15 seconds

Test 2: rake spec:models vs. rspec spec/models
Results:
rake spec:models --> 34 seconds
rspec spec/models --> 16 seconds

Test 3: rake spec:views vs. rspec spec/views
Results:
rake spec:views --> 20 seconds
rspec spec/views --> 11 seconds

Test 4: rake spec vs. rspec spec
Results:
rake spec --> 20 seconds
rspec spec --> 11 seconds

Conclusion: It is faster to use "rspec spec/..." than the broader "rake spec" command.

This makes sense: the rake command has to load the Rakefile first, and from what I've seen online, "rake spec" and "rake cucumber" also invoke "rake db:test:prepare" behind the scenes - a command that rebuilds the test database from the schema, which is necessary if you blew the old one away. (The flip side: if you run rspec or cucumber directly, remember to run "rake db:test:prepare" yourself after changing the schema.)

Here's one more test I did - this time, with cucumber:

Test 5: cucumber vs. rake cucumber
Results:
rake cucumber --> 30 seconds
cucumber --> 21 seconds

Yep.

Tuesday, June 14, 2011

Capybara (click_link, JS problem) & Logging in directly with Authlogic

The click_link / JS problem
I'm using capybara, and I recently came across this error:
Given I am on the home page                              # features/step_definitions/web_steps.rb:42
And I am the registered poster "John Doe"                # features/step_definitions/user_steps.rb:1
When I log in                                            # features/step_definitions/user_steps.rb:11
      You have a nil object when you didn't expect it!
      You might have expected an instance of Array.
      The error occurred while evaluating nil.[] (NoMethodError)
      (eval):2:in `click_link'
      ./features/step_definitions/user_steps.rb:12:in `/^I log in$/'
      features/users_have_basic_abilities.feature:10:in `When I log in'
This was my step definition:

When /^I log in$/ do
  click_link "Login"
end

This is the kind of error that arises if the link you want to click has href="javascript:void(0)".  In my case, my link was

<a href="javascript:void(0)" id="login_activator">Login</a>

With the help of jQuery, this link toggles a div containing the login form, but there is no easy way to test that with Capybara's default driver, RackTest, because RackTest doesn't execute JS.
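For what it's worth, Capybara can exercise the JS if you tag a scenario @javascript, which swaps in a JS-capable driver like Selenium - a route I chose not to go down here. A sketch (scenario name invented), though read on for what I actually did:

@javascript
Scenario: Toggling the login form
  Given I am on the home page
  When I log in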

My solution: sidestep the complications of JS testing (I'm positive the link works anyway) and instead forcibly create a user session to log in.  After all, that's the whole point of this logging-in business - it's just a precondition for what I really want to test.

Logging in directly with Authlogic
I am using Authlogic (https://github.com/binarylogic/authlogic) for user authentication.
While trying code like UserSession.create!(:login => ..., :password => "some_password", ...), I came across this error:
You must activate the Authlogic::Session::Base.controller with a controller object before creating objects (Authlogic::Session::Activation::NotActivatedError)
The solution: ensure the following code is executed before such a user session creation:

Authlogic::Session::Base.controller = Authlogic::ControllerAdapters::RailsAdapter.new(self)

Yep.
Unfortunately, in the end there were still problems: the "save" method no longer worked - something about the wrong number of arguments.
So my FINAL solution was to fall back on plain Cucumber steps: navigate to the page, fill in the login form, click the button, and so on (a sketch of those steps is below).  I guess there never really was a good solution to all of this, so I recommend against using the type of link I did above (a link that opens a little form on the page).
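For reference, here's roughly what that final step definition looks like - the path, field labels, and button text are placeholders, so adjust them to your app:

When /^I log in$/ do
  visit "/login"                           # placeholder: wherever your form lives
  fill_in "Login",    :with => "John Doe"  # placeholder label
  fill_in "Password", :with => "secret"    # placeholder label
  click_button "Log in"                    # placeholder button text
end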

Monday, June 13, 2011

The importance of specifying gem versions in your Gemfile

I'm starting to integrate cucumber into a Rails 3 application.  Specifically, I added 'cucumber-rails' to my Gemfile, but I kept getting this error when I ran "rake cucumber":
undefined method `click' for class `Capybara::Driver::RackTest::Node' (NameError)
Turns out it is very important to specify gem versions in your Gemfile.  I hadn't specified a version for cucumber-rails, and consequently, obscure incompatibilities emerged.

All I had to do was change gem 'cucumber-rails' to gem 'cucumber-rails', '>= 0.5.1'

This is what my Gemfile currently looks like:

source 'http://rubygems.org'
gem 'rails', '3.0.3'
gem 'authlogic', :git => "git://github.com/radar/authlogic.git"
gem 'cancan'
gem 'paperclip', :git => "git://github.com/thoughtbot/paperclip.git"
gem 'acts_as_list', :git => "https://github.com/haihappen/acts_as_list.git"
gem "will_paginate", "~> 3.0.pre2"
gem 'dynamic_form'
gem 'mysql2'
gem 'nokogiri'
gem 'capistrano' # we deploy with capistrano

group :development, :test do
  gem 'sqlite3-ruby', :require => 'sqlite3'
  gem 'shoulda'
  gem 'cucumber-rails', '>= 0.5.1' # important to specify this version
  gem 'database_cleaner'
  gem 'capybara'
  gem "rspec-rails", ">= 2.0.0"
end

# These typically shouldn't be in production, but I need them for seed data
gem 'factory_girl_rails'
gem 'forgery'

*Also, it's important to re-run "rails generate cucumber:install" after a change related to cucumber in your Gemfile.

Sunday, June 12, 2011

Awards. Superficial?

Society has a clever way of motivating people. It has this concept called the "award," and it dates back ages (the Greeks and Romans certainly held contests and rewarded winners).

But have you ever seen through the whole affair? Have you ever gotten tired of them? Casting aside all pretension: I've been to many award ceremonies, and over the years they have become dry and dull. Yes, awards are great, but maybe the countless times I've sensed injustice have led me to simply grow bored of them. People do not necessarily win things deservingly. For many awards, voting is a large factor - and, by consequence, so is popularity.

Even in my case, when I claimed the top spot through legitimate hard work - as I did in a few math contests - I gradually developed a feeling of "who cares?" It's all part of society's game to improve you... which is of course good and noble and blablabla, but the whole affair of praise, fame, etc. taints the value. Even when I won, I found myself shrugging with a mild grin on my face.

I've really come to treasure intrinsic value. This is when I feel real accomplishment. Think of the people who invented the internet. Oh, that's right, you CAN'T. It's not even clear to me who invented the internet... yet it is clearly one of the most momentous inventions in human history. The internet is responsible for millions of businesses, for giving creativity an outlet in infinite forms, for connecting society in a whole new way. Yet we don't know who its inventors are. Sure, go ahead, Google the answer. You'll come across a name, or several, but the fact of the matter is you probably didn't know these people before. Fame and praise have eluded them... but the intrinsic value of their achievement is paramount.

Friday, June 10, 2011

Who cares?

Hey, you're reading this. Maybe. But do I care what you think? Do you care what I think of you?

This past year, I've developed a quasi-nihilistic view of society. It probably has something to do with the college admission process, as I am currently wrapping up my senior year of high school.

It boils down to this: Does anyone really care? Okay, I'm not pessimistic (really - I'm actually very optimistic, and I can back it up with this incredible Time magazine article), but let's consider a few of the superficialities of American society, the ephemeral moments of praise and prestige that people get. Think of the compliments you get from classmates whom you don't really know that well, the congrats you hear given to others for winning Award X. It's mostly out of respect, isn't it? Or just for the fun of yelling something? When you really think about it, with regard to success, people primarily care about themselves and others' perception of themselves.
(disclaimer: this omits philanthropic caring. Also, I would like to re-emphasize the with regard to success)

This whole idea relates to something else you might find interesting: childhood behavior. See, I have a five-year-old sister, and I've observed quite a number of interesting things about certain behaviors - behaviors that epitomize those of all children. I'm only going to focus on the one that is relevant here, and that is the sheer outrage at being told something, or reminded to do something, that the child already knows about. Haven't you seen a kid just heat up in fury because someone older reminds him to wash his hands or do something he was going to do anyway? Part of this rage comes from the kid's desire to be seen by others as one who already knows, one who's more mature. He really cares about what others think - to the point that it pains him that a reminder could suggest he didn't already know. Phrased differently, he recognizes extrinsic value, not intrinsic value. And, interestingly, as we mature, we develop a stronger sense of intrinsic value - a deeper sense of value in the fact that something simply exists, even if no one recognizes it.

In my experience, when it comes to success, society blows genuinely minor things out of proportion. Society's praise becomes excessive, and meanwhile, deep down, no one really cares... people, deep down, focus on themselves.

(A little more rambling: what IS society, anyway? It's not any one person, though one person could very well contribute to this entity we call society...)

Sunday, June 5, 2011

"I love you." (Maybe?)

At least in the part of American society I've grown up in...

Why is it so common to say, "I love you," to all friends alike?  Doesn't this stifle the original power of the words?  More importantly, doesn't it make it harder to tell if someone is serious?

Thursday, June 2, 2011

Quick SQL command... those annoying quotes

Never ever forget the importance of details.

Here's an example of why.

I wanted to drop the "default" setting for a column of type string.  Rails' typical migrations do not have a convenient way of doing this, so I had to run an SQL command.
I admit that at the time I was not experienced with SQL.  Nonetheless, I went to http://api.rubyonrails.org/classes/ActiveRecord/Migration.html to find the answer, and found an example execute statement there.

Instead of copying and pasting it and then editing appropriately, I typed it out... and failed to notice which quote characters it used.  It's really better to just copy and paste so the tiniest details are taken care of - in this case, typing ' does not work; MySQL wants the backtick ` around identifiers.

I really wish I had scrutinized every single character - particularly any quotes.  It would have saved me quite some time.

The end result:
execute "ALTER TABLE `free_spaces` ALTER COLUMN `content` DROP DEFAULT"

Monday, May 30, 2011

Reflections on website deployment

This past weekend, I finally got to see jdrampage.com get deployed.  It's an application I wrote in Rails 3 that I am quite proud of - it runs my high school's online student newspaper, and it's an upgrade from the Rails 2 version created two years ago.

What I learned in the process of its deployment was particularly interesting.  There was a huge disparity in understanding between the roles involved: mine, that of the people we pay to host the site, and that of a sysadmin we spontaneously hired.  The sysadmin, who was solely a consultant, knew no more about the old site's setup than I did.  (Both of us knew nothing about it.)  And I had thought all I needed to do was develop the code and pass it off to someone who would set it up.  It turned out I had to actively engage in parts of the deployment too... like setting up capistrano (http://help.github.com/capistrano/) and sshing and public key stuff and ladeedah.

I guess the experience was quite the eye-opener...

Bottom line: If you're a Rails developer, or perhaps any developer, it's really important to have a clear understanding of what is expected of your developed application on the production end of things. Don't just assume you can make the app and hand it off to someone who'll set it all up for you.

(...now that might come off as a rather obvious bottom line but hey it's important...)



Now, one other point (this one more on the technical side): I encountered an encoding issue in the database transfer from the old site.  The database went from sqlite3 to mysql, and UTF-8 encoding hadn't been handled properly on the old site, so when the transfer was made, the new site displayed odd things like...
  • "| instead of :
  • – instead of -
  • ’ instead of '
  • “ instead of "
  • † instead of "
In my case, it wasn't difficult to manually go in and change things, but it's probably a good idea to take note of encoding before making any database transfer.
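If I ever do another transfer, I'll at least scan for the telltale garbage first - a rough sketch for the rails console, using the exact broken sequences from my list above (Article and body are placeholders for your own model and column):

suspects = ["’", "“", "–", "†"]
Article.find_each do |article|
  text = article.body.to_s
  found = suspects.select { |s| text.include?(s) }
  puts "Article ##{article.id} looks mis-encoded: #{found.join(' ')}" unless found.empty?
end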

Sunday, May 29, 2011

First post

I've decided to do away with the unnecessary formality of blogs (so excuse any less-than-neat typing).
In other words, I'm not going to make this a burden - and I'm not going to make it a laborious task, an energy-sapping hobby.

I'll just write spontaneously.  And hopefully, some of the thoughts that arise here will benefit others in the future.

:]