On a fresh install of Ubuntu 13.10 x86_64, the ia32-libs package is missing. There is a way to work around it by pulling the package from the previous release (13.04, raring); the same steps can also be done entirely from the terminal, as sketched below.
1. Launch Software & Updates
2. Select "Other Software" and add "deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse"
3. Close it and launch a terminal (Ctrl+Alt+T)
4. sudo apt-get update
5. sudo apt-get install ia32-libs
6. Reboot
[Reference] http://ubuntuforums.org/showthread.php?t=2182653...
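A minimal terminal-only sketch of steps 1-5, assuming you are happy to drop the raring source into sources.list.d (the file name raring-ia32.list is only an illustrative choice, not from the original post):
echo "deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse" | sudo tee /etc/apt/sources.list.d/raring-ia32.list   # same entry as the Software & Updates step
sudo apt-get update                    # refresh the package lists with the raring source included
sudo apt-get install ia32-libs         # ia32-libs is available again from the 13.04 archive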
Once the number of bugs grows quickly and you want to review bug status at a glance, you can use the product dashboard extension to manage your bugs. How to install it? (A hedged command sketch follows the steps.)
Step 1. Install bzr. On Ubuntu you can use: apt-get install bzr
Step 2. Branch the Bugzilla 4.2.x code with bzr
Step 3. Branch the Bugzilla 4.2.x BMO code with bzr
Step 4. Move the product dashboard extension from the BMO branch into your Bugzilla installation...
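A rough sketch of steps 1-4; the bzr branch URLs and the ProductDashboard directory are assumptions based on the old bzr.mozilla.org layout, so verify them before running:
sudo apt-get install bzr                                        # step 1
bzr branch bzr://bzr.mozilla.org/bugzilla/4.2 bugzilla          # step 2: plain Bugzilla 4.2.x (assumed URL)
bzr branch bzr://bzr.mozilla.org/bmo/4.2 bmo                    # step 3: BMO 4.2.x branch (assumed URL)
cp -r bmo/extensions/ProductDashboard bugzilla/extensions/      # step 4: copy the extension over (assumed path)
cd bugzilla && ./checksetup.pl                                  # re-run checksetup.pl so Bugzilla picks up the extension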
Step 1: Set up the language: Dash home -> Language Support
Step 2: Install the language, select the keyboard input method system as ibus, and install Chinese (Traditional)
Step 3: Reboot
Step 4: Configure ibus: ibus -> Preferences -> Input Method -> Select an input method
Step 5: Use Ctrl + Space to switch to and type Chinese (a terminal sketch of the package setup follows) ...
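A terminal equivalent of Steps 1-2, assuming the Chewing engine for Traditional Chinese; ibus-chewing is my own choice here, since the post does not name a specific engine:
sudo apt-get install ibus ibus-chewing   # ibus framework plus a Traditional Chinese engine (assumed engine)
im-config -n ibus                        # select ibus as the active input method framework
ibus restart                             # reload ibus so the new engine shows up under Preferences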
Use the UNetbootin tool.
1. Format the USB drive to FAT32 (a command-line sketch is below)
2. Download the Ubuntu iso you want
3. Download UNetbootin http://unetbootin.sourceforge.net/
4. Launch the tool and select the Ubuntu iso
5. Start
6. Reboot when it finishes ...
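A minimal sketch of step 1 from the terminal; /dev/sdX1 is a placeholder for your actual USB partition, so check the device name with lsblk before formatting anything:
lsblk                                      # identify the USB device, e.g. /dev/sdb1
sudo umount /dev/sdX1                      # make sure the partition is not mounted
sudo mkfs.vfat -F 32 -n UBUNTU /dev/sdX1   # create a FAT32 filesystem (label UBUNTU is illustrative)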
I have no idea why the Baidu spider doesn't follow the robots.txt rules, so in the end I used iptables rules to block its IP addresses (a cleaned-up, runnable version is sketched below).
[iptables rules]
iptables -p tcp -I INPUT -j DROP -s 119.63.192.x
iptables -p tcp -I INPUT -j DROP -s 123.125.64.x
iptables -p tcp -I INPUT -j DROP -s 180.76.0.x
iptables -p tcp -I INPUT -j DROP -s 220.181.0.x
[Reference] http://blog.indeepnight.com/2012/03/how-to-block-web-spider-or-crawler.html...
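A runnable form of the rules above, with the options in the conventional order; the .x placeholders are written as /24 CIDR blocks here, which is an assumption about how wide each range should be:
iptables -I INPUT -p tcp -s 119.63.192.0/24 -j DROP
iptables -I INPUT -p tcp -s 123.125.64.0/24 -j DROP
iptables -I INPUT -p tcp -s 180.76.0.0/24 -j DROP
iptables -I INPUT -p tcp -s 220.181.0.0/24 -j DROP
iptables -L INPUT -n --line-numbers     # verify the DROP rules were inserted at the top of the chain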