For a kick-ass webscale big-data setup on your local Mac, you'll want to have Hadoop and Flume in place. No seriously: this setup is especially useful if you want to route the syslog output from your nodes to HDFS in order to process it later using Map/Reduce jobs. It took me a while to figure out how Flume and Hadoop have to be configured so that received messages get written into HDFS. This is why I decided to write a quick tutorial to get things up and running.

So, the first step is as easy as `brew install hadoop` - but don't think that you can just install Hadoop with brew. The current version of the formula is pinned to 0.21, which is an unstable version that, AFAIK, doesn't play together with Flume. I edited my Hadoop formula locally on my Mac, but this should work, too:

After you have finished the Hadoop installation, we need to edit a bunch of files in order to configure Hadoop for a local single-node setup. Go to /usr/local/Cellar/hadoop/0.20.2/libexec/conf and change the following files:

Hadoop is now configured to use /tmp/hadoop as its HDFS folder. Now, we need to create and format that directory.

Hadoop will connect to localhost using ssh. To configure ssh so that it can connect from localhost to localhost without needing a password, we need to add your public key to your authorized keys:

```
cat ~/.ssh/id_rsa.pub > ~/.ssh/authorized_keys
```

We can test this by trying to ssh into localhost without using a password:
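The original code blocks for the conf changes didn't survive, so here is a minimal sketch of what a single-node `core-site.xml` in that conf directory might look like. The property values (the `/tmp/hadoop` folder from above, and port 9000) are assumptions on my part, not the post's original settings:

```xml
<!-- core-site.xml: minimal single-node sketch (assumed values) -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <!-- matches the /tmp/hadoop HDFS folder mentioned in the text -->
    <value>/tmp/hadoop</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <!-- assumed default namenode port -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

On Hadoop 0.20.x, running `hadoop namenode -format` afterwards takes care of creating and formatting the directory, and `start-all.sh` brings up the daemons.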
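The passwordless-ssh step can be sketched as a short command sequence. This assumes an RSA keypair; the `ssh-keygen` line is only needed if you don't have one yet:

```shell
# Sketch of the passwordless-ssh setup. The key generation line is only
# needed if no keypair exists yet; -N "" means an empty passphrase.
mkdir -p ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa -q

# Append the public key to authorized_keys (the post overwrites with '>',
# but '>>' is safer if the file already has other entries)
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# ssh localhost   # should now log in without a password prompt
```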