At some point, Ubuntu started shipping OpenJDK by default, with /usr/bin/java pointing to OpenJDK out of the box.


After installing the Sun JDK distributed by Oracle, switching the default is as simple as the following (you could also just adjust your PATH instead).


This assumes the Sun JDK was installed to /usr/local/java-6-sun.


nuke@ubuntu:~$ sudo update-alternatives --install "/usr/bin/java" "java" "/usr/local/java-6-sun/bin/java" 1


nuke@ubuntu:~$ sudo update-alternatives --install "/usr/bin/javac" "javac" "/usr/local/java-6-sun/bin/javac" 1

update-alternatives: using /usr/local/java-6-sun/bin/javac to provide /usr/bin/javac (javac) in auto mode.


nuke@ubuntu:~$ sudo update-alternatives --install "/usr/bin/javaws" "javaws" "/usr/local/java-6-sun/bin/javaws" 1

update-alternatives: using /usr/local/java-6-sun/bin/javaws to provide /usr/bin/javaws (javaws) in auto mode.


nuke@ubuntu:~$ sudo update-alternatives --config java

There are 2 choices for the alternative java (providing /usr/bin/java).


  Selection    Path                                           Priority   Status

------------------------------------------------------------

* 0            /usr/lib/jvm/java-6-openjdk-i386/jre/bin/java   1061      auto mode

  1            /usr/lib/jvm/java-6-openjdk-i386/jre/bin/java   1061      manual mode

  2            /usr/local/java-6-sun/bin/java                     1         manual mode


Press enter to keep the current choice[*], or type selection number: 2

update-alternatives: using /usr/local/java-6-sun/bin/java to provide /usr/bin/java (java) in manual mode.


nuke@ubuntu:~$ java -version

java version "1.6.0_37"

Java(TM) SE Runtime Environment (build 1.6.0_37-b06)

Java HotSpot(TM) Client VM (build 20.12-b01, mixed mode, sharing)


Previously I set PATH so that the Oracle java ran first, but since you can register alternatives and then pick the default, this method is more convenient to use from now on.
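The three install commands above all follow the same pattern, so they can be generated with a small loop. A sketch, assuming the same /usr/local/java-6-sun install path; it only echoes the commands so you can inspect them before running anything:

```shell
#!/bin/sh
# Sketch: print the update-alternatives registration command for each JDK tool.
# JDK_HOME is an assumption; adjust it to your actual install path.
JDK_HOME=/usr/local/java-6-sun
for tool in java javac javaws; do
    echo "sudo update-alternatives --install /usr/bin/$tool $tool $JDK_HOME/bin/$tool 1"
done
```

Once the printed commands look right, you can pipe the output to `sh` to actually register them.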

커뉴

Does humankind possess anything in this world more certain than a dream?

Fragments came in starting with Honeycomb (Android 3), but with my work so concentrated on providers and network sync, I never got to handle Fragments much.


So, really, truly basic!!! To have something to grab as a sample whenever I need a reference later, I made an example that attaches two Fragments to a single Activity.

It is a pure skeleton with no content at all, meant only as a base to paste the parts you need into later.





It looks quite bare~~ and the source likewise has no content; fill it in when needed.


1. Main Layout

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >
    <fragment
        android:id="@+id/fragment1"
        android:layout_width="fill_parent"
        android:layout_height="0dp"
        android:layout_weight="1"
        class="com.hopeisagoodthing.fragments.Fragment1" />
    <fragment
        android:id="@+id/fragment2"
        android:layout_width="fill_parent"
        android:layout_height="0dp"
        android:layout_weight="1"
        class="com.hopeisagoodthing.fragments.Fragment2" />
</LinearLayout>
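For reference, if the two fragments ever need to be swapped at runtime rather than fixed in the layout, a common alternative (not used in this post) is to declare plain container views and attach the fragments from code with a FragmentTransaction. A sketch; the container ids below are my own placeholders:

```xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >
    <!-- Empty containers; fill them from code with FragmentTransaction add()/replace() -->
    <FrameLayout
        android:id="@+id/container1"
        android:layout_width="fill_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />
    <FrameLayout
        android:id="@+id/container2"
        android:layout_width="fill_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />
</LinearLayout>
```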



2. Fragment1 layout

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerHorizontal="true"
        android:layout_centerVertical="true"
        android:text="@string/fragment1"
        tools:context=".MainActivity" />

</RelativeLayout>



3. Fragment2 layout

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerHorizontal="true"
        android:layout_centerVertical="true"
        android:text="@string/fragment2"
        tools:context=".MainActivity" />

</RelativeLayout>



4. MainActivity.java

package com.hopeisagoodthing.fragments;

import android.app.Activity;
import android.os.Bundle;

public class MainActivity extends Activity {
    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
    }
}



5. Fragment1.java

package com.hopeisagoodthing.fragments;

import android.app.Fragment;
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;


public class Fragment1 extends Fragment {

	@Override
	public View onCreateView(LayoutInflater inflater,
			ViewGroup container, Bundle savedInstanceState) {
		View v = inflater.inflate(R.layout.fragment1, container, false);	

		return v;
	}

}



6. Fragment2.java

package com.hopeisagoodthing.fragments;

import android.app.Fragment;
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;

public class Fragment2 extends Fragment {

	@Override
	public View onCreateView(LayoutInflater inflater,
			ViewGroup container, Bundle savedInstanceState) {
		View v = inflater.inflate(R.layout.fragment2, container, false);		

		return v;
	}

}


Today, out of boredom, I was looking over what is newly added in Android 4.x, and one of the additions looked fun, so I opened eclipse thinking, shall I code it up?


But there was no need to write any code.





The reason: an AnalogClock view is supported outright. Great!!!! There is nothing to do at all; just adding the view below finishes everything.


<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <AnalogClock
        android:id="@+id/analog"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</RelativeLayout>
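If the default clock face is too plain, AnalogClock also exposes drawable attributes for the dial and hands. A sketch; the drawable names below are hypothetical placeholders, not resources from this post:

```xml
<AnalogClock
    android:id="@+id/analog"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:dial="@drawable/my_dial"
    android:hand_hour="@drawable/my_hour_hand"
    android:hand_minute="@drawable/my_minute_hand" />
```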



When reading or writing code, code with a huge line count, or code where data and logic are not separated, makes me uneasy before I even realize it.

A voice from somewhere deep inside seems to say "fix it... remove it... shrink it!!", and in that anxious state I cannot properly write or read the code.


In pointer-capable languages of the C family, when switch or if statements piled up I would build a table of function pointers instead; but moving to Java, huh??? there is nothing like a pointer...

So I code with interface and Map to eliminate the switch-case blocks.


Is removing them actually better later on??


It depends on the situation, but in general it reduces the amount of code. Of course there are many cases where switch-case is the most compact way to code, but a switch normally hard-codes an int discriminator for each branch (since you write it as `case CONSTANT:`).

With a map, the discriminator can be anything that can form a key, whether an array index or some key object; so there is no need to define the cases separately. Just take the input keyed in a distinguishable form from the start.

And when a case is added or removed, only the map part needs to be modified; the place that used to branch on the switch no longer needs any changes.

This makes it possible to split the code into a class that manages the logic and a class that manages the cases. Pull the Maps out into their own classes and hand the Map that fits the situation up to the logic, and the switch-case becomes portable.


Back in undergrad, when an assignment said "build something in C", I reached for switch-case without a second thought; but the longer I keep programming, the more anxious I get about code that keeps accumulating if/switch/case. ㅠ.ㅠ

These days, when someone on a project I am in writes a switch-case, my fingers drift toward ctrl-a, delete on their own. @.@

Nobody would really write code like the example below; please take it only as a demonstration of how an interface can stand in for the cases.



package com.hopeisagoodthing.removeswitch;

import java.util.HashMap;
import java.util.Map;
import java.util.Random;

public class Main {

	private interface IMethod{
		void method();
	}
	
	private static Map<Integer,IMethod> methodArray= new HashMap<Integer,IMethod>();
	static{
		methodArray.put(1, new IMethod(){
			@Override
			public void method() {
				System.out.println("1");				
			}});
		
		methodArray.put(2, new IMethod(){
			@Override
			public void method() {
				System.out.println("2");				
			}});
		
		methodArray.put(3, new IMethod(){
			@Override
			public void method() {
				System.out.println("3");				
			}});
		
		methodArray.put(4, new IMethod(){
			@Override
			public void method() {
				System.out.println("4");				
			}});
		
		methodArray.put(5, new IMethod(){
			@Override
			public void method() {
				System.out.println("5");				
			}});
		
		methodArray.put(6, new IMethod(){
			@Override
			public void method() {
				System.out.println("6");				
			}});
		
		methodArray.put(7, new IMethod(){
			@Override
			public void method() {
				System.out.println("7");				
			}});
		
		methodArray.put(8, new IMethod(){
			@Override
			public void method() {
				System.out.println("8");				
			}});
		
		methodArray.put(9, new IMethod(){
			@Override
			public void method() {
				System.out.println("9");				
			}});
		
		methodArray.put(10, new IMethod(){
			@Override
			public void method() {
				System.out.println("10");				
			}});
		
		methodArray.put(11, new IMethod(){
			@Override
			public void method() {
				System.out.println("11");				
			}});
		
		methodArray.put(12, new IMethod(){
			@Override
			public void method() {
				System.out.println("12");				
			}});
		
		methodArray.put(13, new IMethod(){
			@Override
			public void method() {
				System.out.println("13");				
			}});
		
		methodArray.put(14, new IMethod(){
			@Override
			public void method() {
				System.out.println("14");				
			}});
		
		methodArray.put(15, new IMethod(){
			@Override
			public void method() {
				System.out.println("15");				
			}});
		
		methodArray.put(16, new IMethod(){
			@Override
			public void method() {
				System.out.println("16");				
			}});
		
		methodArray.put(17, new IMethod(){
			@Override
			public void method() {
				System.out.println("17");				
			}});
		
		methodArray.put(18, new IMethod(){
			@Override
			public void method() {
				System.out.println("18");				
			}});
		
		methodArray.put(19, new IMethod(){
			@Override
			public void method() {
				System.out.println("19");				
			}});
		
		methodArray.put(20, new IMethod(){
			@Override
			public void method() {
				System.out.println("20");				
			}});
		
		methodArray.put(21, new IMethod(){
			@Override
			public void method() {
				System.out.println("21");				
			}});
		
		methodArray.put(22, new IMethod(){
			@Override
			public void method() {
				System.out.println("22");				
			}});
		
		methodArray.put(23, new IMethod(){
			@Override
			public void method() {
				System.out.println("23");				
			}});
		
		methodArray.put(24, new IMethod(){
			@Override
			public void method() {
				System.out.println("24");				
			}});
		
		methodArray.put(25, new IMethod(){
			@Override
			public void method() {
				System.out.println("25");				
			}});
		
		methodArray.put(26, new IMethod(){
			@Override
			public void method() {
				System.out.println("26");				
			}});
		
		methodArray.put(27, new IMethod(){
			@Override
			public void method() {
				System.out.println("27");				
			}});
		
		methodArray.put(28, new IMethod(){
			@Override
			public void method() {
				System.out.println("28");				
			}});
		
		methodArray.put(29, new IMethod(){
			@Override
			public void method() {
				System.out.println("29");				
			}});
		
		methodArray.put(30, new IMethod(){
			@Override
			public void method() {
				System.out.println("30");				
			}});
		
		methodArray.put(31, new IMethod(){
			@Override
			public void method() {
				System.out.println("31");				
			}});
		
		methodArray.put(32, new IMethod(){
			@Override
			public void method() {
				System.out.println("32");				
			}});
		
		methodArray.put(33, new IMethod(){
			@Override
			public void method() {
				System.out.println("33");				
			}});
		
		methodArray.put(34, new IMethod(){
			@Override
			public void method() {
				System.out.println("34");				
			}});
		
		methodArray.put(35, new IMethod(){
			@Override
			public void method() {
				System.out.println("35");				
			}});
		
		methodArray.put(36, new IMethod(){
			@Override
			public void method() {
				System.out.println("36");				
			}});
		
		methodArray.put(37, new IMethod(){
			@Override
			public void method() {
				System.out.println("37");				
			}});
		
		methodArray.put(38, new IMethod(){
			@Override
			public void method() {
				System.out.println("38");				
			}});
		
		methodArray.put(39, new IMethod(){
			@Override
			public void method() {
				System.out.println("39");				
			}});
		
		methodArray.put(40, new IMethod(){
			@Override
			public void method() {
				System.out.println("40");				
			}});
	}
	
	private static void withoutSwitch(int inA)
	{
		inA = inA%40+1;				
		final IMethod method = methodArray.get(inA);
		if(method!=null){
			method.method();
		}
		else
		{
			System.out.println("nothing");
		}		
	}
	
	private static void useSwitch(int inA)
	{
		inA = inA%40+1;
		switch(inA)
		{
		case 1:
			System.out.println("1");
			break;
		case 2:
			System.out.println("2");
			break;
		case 3:
			System.out.println("3");
			break;
		case 4:
			System.out.println("4");
			break;
		case 5:
			System.out.println("5");
			break;
		case 6:
			System.out.println("6");
			break;
		case 7:
			System.out.println("7");
			break;
		case 8:
			System.out.println("8");
			break;
		case 9:
			System.out.println("9");
			break;
		case 10:
			System.out.println("10");
			break;
		case 11:
			System.out.println("11");
			break;
		case 12:
			System.out.println("12");
			break;
		case 13:
			System.out.println("13");
			break;
		case 14:
			System.out.println("14");
			break;
		case 15:
			System.out.println("15");
			break;
		case 16:
			System.out.println("16");
			break;
		case 17:
			System.out.println("17");
			break;
		case 18:
			System.out.println("18");
			break;
		case 19:
			System.out.println("19");
			break;
		case 20:
			System.out.println("20");
			break;
		case 21:
			System.out.println("21");
			break;
		case 22:
			System.out.println("22");
			break;
		case 23:
			System.out.println("23");
			break;
		case 24:
			System.out.println("24");
			break;
		case 25:
			System.out.println("25");
			break;
		case 26:
			System.out.println("26");
			break;
		case 27:
			System.out.println("27");
			break;
		case 28:
			System.out.println("28");
			break;
		case 29:
			System.out.println("29");
			break;
		case 30:
			System.out.println("30");
			break;
		case 31:
			System.out.println("31");
			break;
		case 32:
			System.out.println("32");
			break;
		case 33:
			System.out.println("33");
			break;
		case 34:
			System.out.println("34");
			break;
		case 35:
			System.out.println("35");
			break;
		case 36:
			System.out.println("36");
			break;
		case 37:
			System.out.println("37");
			break;
		case 38:
			System.out.println("38");
			break;
		case 39:
			System.out.println("39");
			break;
		case 40:
			System.out.println("40");
			break;
			
		default:
			System.out.println("nothing");
			break;			
		}
		
	}
	private static final int TOTAL_TRIAL = 20;
	private static final int MAX_LOOP = 20000;
	public static void main(String[] args) {		
		Random urandom = new Random();
		int[] inputs = new int[MAX_LOOP];		
		int[] useSwitchResult = new int[TOTAL_TRIAL];
		int[] withoutSwitchResult = new int[TOTAL_TRIAL];
		
		for(int i=0;i<MAX_LOOP;i++){
			inputs[i] = urandom.nextInt();
		}
		
		
		for(int trial=0;trial<TOTAL_TRIAL;trial++)
		{
			System.gc();
			long start = System.currentTimeMillis();
			for(int a=0;a<500*trial;a++){						
				useSwitch(inputs[a]);
			}
			useSwitchResult[trial] = (int) (System.currentTimeMillis()-start);			
		}
		
				
		for(int trial=0;trial<TOTAL_TRIAL;trial++)
		{		
			System.gc();
			long start = System.currentTimeMillis();		
			for(int a=0;a<500*trial;a++){			
				withoutSwitch(inputs[a]);
			}
			withoutSwitchResult[trial] = (int) (System.currentTimeMillis()-start);
		}
		System.out.println("useSwitch :" );
		for(int trial=0;trial<TOTAL_TRIAL;trial++)
		{
			System.out.print(useSwitchResult[trial]+"\t");
		}
		
		System.out.println("\nwithoutSwitch :" );
		for(int trial=0;trial<TOTAL_TRIAL;trial++)
		{
			System.out.print(withoutSwitchResult[trial]+"\t");
		}				
	}
}
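The 40 anonymous IMethod instances above exist only to mirror the 40-case switch one-for-one; in ordinary code the same map-dispatch idea is much shorter. A minimal sketch of the same technique, using plain java.lang.Runnable instead of a custom interface (the Dispatcher class and its method names are my own, not from the post):

```java
import java.util.HashMap;
import java.util.Map;

public class Dispatcher {
    private static final Map<Integer, Runnable> HANDLERS = new HashMap<Integer, Runnable>();
    static {
        // Register one handler per case; adding or removing a case touches only this table.
        for (int i = 1; i <= 40; i++) {
            final int n = i;
            HANDLERS.put(i, new Runnable() {
                public void run() { System.out.println(n); }
            });
        }
    }

    /** Map the input to a key, look up its handler, and fall back when none matches. */
    public static String dispatch(int key) {
        Runnable r = HANDLERS.get(key % 40 + 1);
        if (r != null) {
            r.run();
            return "handled";
        }
        System.out.println("nothing");
        return "nothing";
    }

    public static void main(String[] args) {
        dispatch(0);  // prints "1"
        dispatch(39); // prints "40"
    }
}
```

The lookup site never changes again; only the table does, which is exactly the maintainability argument made above.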




A while ago, while doing a small project with Lucene, I learned for the first time what an inverted index is; back then, not knowing any better, I just used what the library provided, with no idea what it builds inside or on what principle it works.


Studying Hadoop, it turns out that with MapReduce you can generate an inverted index remarkably simply.


Since I had no files with much data in them, I downloaded 10 recommended books as TXT from Project Gutenberg, which offers free eBooks, and used them as test data.

http://www.gutenberg.org/wiki/Main_Page


1. Moving the data into the Hadoop FS

hadoop@ubuntu:~/work$ cd data

hadoop@ubuntu:~/work/data$ ls

matrix_input.2x2   pg11.txt   pg1342.txt   pg30601.txt  pg5000.txt

matrixmulti.2x3x2  pg132.txt  pg27827.txt  pg4300.txt   prince.txt


hadoop@ubuntu:~/work$ hadoop dfs -put data /data

hadoop@ubuntu:~/work$ ../bin/hadoop/bin/hadoop dfs -ls /data/data

Found 8 items

-rw-r--r--   1 hadoop supergroup     167497 2012-11-11 07:06 /data/data/pg11.txt

-rw-r--r--   1 hadoop supergroup     343695 2012-11-11 07:06 /data/data/pg132.txt

-rw-r--r--   1 hadoop supergroup     704139 2012-11-11 07:06 /data/data/pg1342.txt

-rw-r--r--   1 hadoop supergroup     359504 2012-11-11 07:06 /data/data/pg27827.txt

-rw-r--r--   1 hadoop supergroup     384522 2012-11-11 07:06 /data/data/pg30601.txt

-rw-r--r--   1 hadoop supergroup    1573112 2012-11-11 07:06 /data/data/pg4300.txt

-rw-r--r--   1 hadoop supergroup    1423801 2012-11-11 07:06 /data/data/pg5000.txt

-rw-r--r--   1 hadoop supergroup      92295 2012-11-11 07:06 /data/data/prince.txt


2. Coding the MapReduce job that generates the inverted index (the code is so simple!!)


package hopeisagoodthing;

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.io.LongWritable;

public class InvertedIndex {
	public static class InvertedIndexMapper extends Mapper<Object,Text,Text,Text> {
		private Text outValue = new Text();
		private Text word = new Text();
		private static String docId = null;
		
		public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
			// The input key is the byte offset of this line within the file;
			// use it as the starting position for the line's tokens.
			StringTokenizer iter = new StringTokenizer(value.toString()," ",true);
			Long pos = ((LongWritable)key).get();
			while ( iter.hasMoreTokens() ) {
				String token = iter.nextToken();
				if(token.equals(" "))
				{
					pos = pos + 1;
				}
				else
				{
					// Emit (word, "docId:position") for every token.
					word.set(token);
					outValue.set(docId+":"+pos);
					pos = pos + token.length();
					context.write(word,outValue);
				}
			}
		}

		@Override
		protected void setup(Context context) throws IOException, InterruptedException {
			// Use the input file's name as the document id.
			docId = ((FileSplit)context.getInputSplit()).getPath().getName();			
		}
	}
	public static class InvertedIndexReducer extends Reducer<Text,Text,Text,Text> {
		private Text counter = new Text();
		public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
			// Join every "docId:position" posting for this word with commas.
			StringBuilder countersb = new StringBuilder();
			for ( Text val : values ) {
				countersb.append(val);
				countersb.append(",");
			}
			countersb.setLength(countersb.length()-1);	// drop the trailing comma
			
			counter.set(countersb.toString());
			context.write(key,counter);
		}
	}
	public static void main(String[] args) throws Exception {		
		Configuration conf = new Configuration();
		String[] otherArgs = new GenericOptionsParser(conf,args).getRemainingArgs();
		
		Job job = new Job(conf,"invertedindex");
		job.setJarByClass(InvertedIndex.class);
		job.setMapperClass(InvertedIndexMapper.class);
		job.setReducerClass(InvertedIndexReducer.class);
		job.setOutputKeyClass(Text.class);
		job.setOutputValueClass(Text.class);
		job.setNumReduceTasks(2);	

		FileInputFormat.addInputPath(job,new Path(otherArgs[0]));
		FileOutputFormat.setOutputPath(job,new Path(otherArgs[1]));
		System.exit(job.waitForCompletion(true) ? 0 : 1 );
	}	
}
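To see what the mapper and reducer compute without running Hadoop, here is a miniature in-memory version of the same idea (a sketch; the class and method names are mine). One difference to note: positions here are character offsets from the start of each document, whereas the job above uses the line's byte offset as its base:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MiniInvertedIndex {
    /** Build word -> "docId:pos,docId:pos,..." the way the MapReduce job does. */
    public static Map<String, String> build(Map<String, String> docs) {
        Map<String, String> index = new LinkedHashMap<String, String>();
        for (Map.Entry<String, String> doc : docs.entrySet()) {
            String docId = doc.getKey();
            int pos = 0;
            // "Map" phase: emit (word, docId:pos) per token, tracking offsets.
            for (String token : doc.getValue().split(" ", -1)) {
                if (!token.isEmpty()) {
                    String posting = docId + ":" + pos;
                    // "Reduce" phase: join all postings for the same word.
                    String old = index.get(token);
                    index.put(token, old == null ? posting : old + "," + posting);
                }
                pos += token.length() + 1; // +1 for the space delimiter
            }
        }
        return index;
    }

    public static void main(String[] args) {
        Map<String, String> docs = new LinkedHashMap<String, String>();
        docs.put("a.txt", "to be or not to be");
        System.out.println(build(docs).get("be")); // a.txt:3,a.txt:16
    }
}
```

Hadoop's shuffle does the grouping-by-word that the map lookup simulates here, which is why the real reducer only has to concatenate values.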



3. Running it on Hadoop.

hadoop@ubuntu:~/work$ ../bin/hadoop/bin/hadoop jar hopeisagoodthing.jar invertedindex -jt local /data/data /data/invertedindext_out

12/11/11 07:16:29 INFO util.NativeCodeLoader: Loaded the native-hadoop library

12/11/11 07:16:29 INFO input.FileInputFormat: Total input paths to process : 10

12/11/11 07:16:29 WARN snappy.LoadSnappy: Snappy native library not loaded

12/11/11 07:16:29 INFO mapred.JobClient: Running job: job_local_0001

12/11/11 07:16:30 INFO util.ProcessTree: setsid exited with exit code 0

12/11/11 07:16:30 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@93d6bc

12/11/11 07:16:30 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:30 INFO mapred.JobClient:  map 0% reduce 0%

12/11/11 07:16:32 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:32 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:33 INFO mapred.MapTask: Spilling map output: record full = true

12/11/11 07:16:33 INFO mapred.MapTask: bufstart = 0; bufend = 6288000; bufvoid = 99614720

12/11/11 07:16:33 INFO mapred.MapTask: kvstart = 0; kvend = 262144; length = 327680

12/11/11 07:16:33 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:33 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:33 INFO mapred.MapTask: Finished spill 1

12/11/11 07:16:33 INFO mapred.Merger: Merging 2 sorted segments

12/11/11 07:16:33 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 6967137 bytes

12/11/11 07:16:34 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting

12/11/11 07:16:35 INFO mapred.LocalJobRunner: 

12/11/11 07:16:35 INFO mapred.LocalJobRunner: 

12/11/11 07:16:35 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done.

12/11/11 07:16:35 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1b64e6a

12/11/11 07:16:35 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:36 INFO mapred.JobClient:  map 100% reduce 0%

12/11/11 07:16:36 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:36 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:37 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:38 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:38 INFO mapred.Task: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting

12/11/11 07:16:38 INFO mapred.LocalJobRunner: 

12/11/11 07:16:38 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0' done.

12/11/11 07:16:38 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@161dfb5

12/11/11 07:16:38 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:39 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:39 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:39 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:39 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:39 INFO mapred.Task: Task:attempt_local_0001_m_000002_0 is done. And is in the process of commiting

12/11/11 07:16:41 INFO mapred.LocalJobRunner: 

12/11/11 07:16:41 INFO mapred.Task: Task 'attempt_local_0001_m_000002_0' done.

12/11/11 07:16:42 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@c09554

12/11/11 07:16:42 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:42 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:42 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:42 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:42 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:42 INFO mapred.Task: Task:attempt_local_0001_m_000003_0 is done. And is in the process of commiting

12/11/11 07:16:45 INFO mapred.LocalJobRunner: 

12/11/11 07:16:45 INFO mapred.Task: Task 'attempt_local_0001_m_000003_0' done.

12/11/11 07:16:45 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1309e87

12/11/11 07:16:45 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:45 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:45 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:45 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:45 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:45 INFO mapred.Task: Task:attempt_local_0001_m_000004_0 is done. And is in the process of commiting

12/11/11 07:16:48 INFO mapred.LocalJobRunner: 

12/11/11 07:16:48 INFO mapred.Task: Task 'attempt_local_0001_m_000004_0' done.

12/11/11 07:16:48 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@6c585a

12/11/11 07:16:48 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:48 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:48 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:48 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:48 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:48 INFO mapred.Task: Task:attempt_local_0001_m_000005_0 is done. And is in the process of commiting

12/11/11 07:16:51 INFO mapred.LocalJobRunner: 

12/11/11 07:16:51 INFO mapred.Task: Task 'attempt_local_0001_m_000005_0' done.

12/11/11 07:16:51 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@e3c624

12/11/11 07:16:51 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:51 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:51 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:51 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:51 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:51 INFO mapred.Task: Task:attempt_local_0001_m_000006_0 is done. And is in the process of commiting

12/11/11 07:16:54 INFO mapred.LocalJobRunner: 

12/11/11 07:16:54 INFO mapred.Task: Task 'attempt_local_0001_m_000006_0' done.

12/11/11 07:16:54 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1950198

12/11/11 07:16:54 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:54 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:54 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:54 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:54 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:54 INFO mapred.Task: Task:attempt_local_0001_m_000007_0 is done. And is in the process of commiting

12/11/11 07:16:57 INFO mapred.LocalJobRunner: 

12/11/11 07:16:57 INFO mapred.Task: Task 'attempt_local_0001_m_000007_0' done.

12/11/11 07:16:57 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@53fb57

12/11/11 07:16:57 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:16:57 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:16:57 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:16:57 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:16:57 INFO mapred.MapTask: Finished spill 0

12/11/11 07:16:57 INFO mapred.Task: Task:attempt_local_0001_m_000008_0 is done. And is in the process of commiting

12/11/11 07:17:00 INFO mapred.LocalJobRunner: 

12/11/11 07:17:00 INFO mapred.Task: Task 'attempt_local_0001_m_000008_0' done.

12/11/11 07:17:00 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1742700

12/11/11 07:17:00 INFO mapred.MapTask: io.sort.mb = 100

12/11/11 07:17:00 INFO mapred.MapTask: data buffer = 79691776/99614720

12/11/11 07:17:00 INFO mapred.MapTask: record buffer = 262144/327680

12/11/11 07:17:00 INFO mapred.MapTask: Starting flush of map output

12/11/11 07:17:00 INFO mapred.MapTask: Finished spill 0

12/11/11 07:17:00 INFO mapred.Task: Task:attempt_local_0001_m_000009_0 is done. And is in the process of commiting

12/11/11 07:17:03 INFO mapred.LocalJobRunner: 

12/11/11 07:17:03 INFO mapred.Task: Task 'attempt_local_0001_m_000009_0' done.

12/11/11 07:17:03 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@491c4c

12/11/11 07:17:03 INFO mapred.LocalJobRunner: 

12/11/11 07:17:03 INFO mapred.Merger: Merging 10 sorted segments

12/11/11 07:17:03 INFO mapred.Merger: Down to the last merge-pass, with 10 segments left of total size: 22414959 bytes

12/11/11 07:17:03 INFO mapred.LocalJobRunner: 

12/11/11 07:17:05 INFO mapred.Task: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting

12/11/11 07:17:05 INFO mapred.LocalJobRunner: 

12/11/11 07:17:05 INFO mapred.Task: Task attempt_local_0001_r_000000_0 is allowed to commit now

12/11/11 07:17:05 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to /data/invertedindext_out

12/11/11 07:17:06 INFO mapred.LocalJobRunner: reduce > reduce

12/11/11 07:17:06 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0' done.

12/11/11 07:17:06 INFO mapred.JobClient:  map 100% reduce 100%

12/11/11 07:17:06 INFO mapred.JobClient: Job complete: job_local_0001

12/11/11 07:17:06 INFO mapred.JobClient: Counters: 22

12/11/11 07:17:06 INFO mapred.JobClient:   File Output Format Counters 

12/11/11 07:17:06 INFO mapred.JobClient:     Bytes Written=16578538

12/11/11 07:17:06 INFO mapred.JobClient:   FileSystemCounters

12/11/11 07:17:06 INFO mapred.JobClient:     FILE_BYTES_READ=99911067

12/11/11 07:17:06 INFO mapred.JobClient:     HDFS_BYTES_READ=46741458

12/11/11 07:17:06 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=286139450

12/11/11 07:17:06 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=16578538

12/11/11 07:17:06 INFO mapred.JobClient:   File Input Format Counters 

12/11/11 07:17:06 INFO mapred.JobClient:     Bytes Read=5048729

12/11/11 07:17:06 INFO mapred.JobClient:   Map-Reduce Framework

12/11/11 07:17:06 INFO mapred.JobClient:     Map output materialized bytes=22414999

12/11/11 07:17:06 INFO mapred.JobClient:     Map input records=108530

12/11/11 07:17:06 INFO mapred.JobClient:     Reduce shuffle bytes=0

12/11/11 07:17:06 INFO mapred.JobClient:     Spilled Records=2015979

12/11/11 07:17:06 INFO mapred.JobClient:     Map output bytes=20666931

12/11/11 07:17:06 INFO mapred.JobClient:     Total committed heap usage (bytes)=2057838592

12/11/11 07:17:06 INFO mapred.JobClient:     CPU time spent (ms)=0

12/11/11 07:17:06 INFO mapred.JobClient:     SPLIT_RAW_BYTES=1062

12/11/11 07:17:06 INFO mapred.JobClient:     Combine input records=0

12/11/11 07:17:06 INFO mapred.JobClient:     Reduce input records=874004

12/11/11 07:17:06 INFO mapred.JobClient:     Reduce input groups=94211

12/11/11 07:17:06 INFO mapred.JobClient:     Combine output records=0

12/11/11 07:17:06 INFO mapred.JobClient:     Physical memory (bytes) snapshot=0

12/11/11 07:17:06 INFO mapred.JobClient:     Reduce output records=94211

12/11/11 07:17:06 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=0

12/11/11 07:17:06 INFO mapred.JobClient:     Map output records=874004



4. Fetching the result and checking it

hadoop@ubuntu:~/work$ ../bin/hadoop/bin/hadoop dfs -get /data/invertedindext_out ./

hadoop@ubuntu:~/work$ cd invertedindext_out/

hadoop@ubuntu:~/work/invertedindext_out$ ls

part-r-00000  _SUCCESS


hadoop@ubuntu:~/work/invertedindext_out$ more part-r-00000

! prince.txt:41794

" pg27827.txt:25391,pg27827.txt:22695,pg27827.txt:23024,pg27827.txt:23250,pg27827.txt:22637,pg27827.txt:22398,pg

27827.txt:22343,pg27827.txt:23961,pg27827.txt:24079,pg27827.txt:24142,pg27827.txt:24191,pg27827.txt:24298,pg27827.txt:

22284,pg27827.txt:22173,pg27827.txt:22062,pg27827.txt:24690,pg27827.txt:24755,pg27827.txt:21931,pg27827.txt:21873,pg27

827.txt:21842,pg27827.txt:21807,pg27827.txt:21490,pg27827.txt:21300,pg27827.txt:21243,pg27827.txt:22812,pg27827.txt:24

933,pg27827.txt:24990,pg27827.txt:25037

"'After pg1342.txt:639410

"'My pg1342.txt:638650

"'Spells pg132.txt:249299

"'TIS pg11.txt:121592

"'Tis pg1342.txt:584553,pg1342.txt:609915

"'To prince.txt:64294

"'army' pg132.txt:15601

"(1) pg132.txt:264126

"(1)". pg27827.txt:336002

"(2)". pg27827.txt:335943

"(Lo)cra" pg5000.txt:656915

"--Exactly. prince.txt:81916

"--SAID pg11.txt:143118

"13 pg132.txt:25622,pg132.txt:19470,pg132.txt:37173,pg132.txt:18165

"1490 pg5000.txt:1354328

"1498," pg5000.txt:1372794

"35" pg5000.txt:723641

"40," pg5000.txt:628271

"A pg132.txt:106978,pg132.txt:316296,pg132.txt:143678,pg132.txt:295414,pg132.txt:233970,pg132.txt:295533,pg132.tx

t:211327,prince.txt:22778,prince.txt:27294,prince.txt:20386,prince.txt:48701,prince.txt:20453,prince.txt:22765,prince.

txt:51105,pg27827.txt:338327,pg27827.txt:250594,pg27827.txt:279032,pg27827.txt:287388,pg27827.txt:286979,pg27827.txt:2

40963,pg27827.txt:338358,pg1342.txt:136267,pg1342.txt:288024,pg1342.txt:428735,pg1342.txt:522298,pg1342.txt:399633,pg1

342.txt:671942,pg1342.txt:137439,pg1342.txt:269156,pg1342.txt:101072,pg1342.txt:600412,pg1342.txt:381033,pg1342.txt:40

1449,pg30601.txt:192068,pg30601.txt:116286,pg30601.txt:63986,pg30601.txt:191918,pg30601.txt:63841

"AS-IS". pg5000.txt:1419667

"A_ pg5000.txt:690824

"Abide pg132.txt:187432

"About pg132.txt:7622,pg1342.txt:101653,pg1342.txt:130436,pg27827.txt:115885


