bing has bong’d

Looking through some logs, I noticed a bunch of traffic coming from bing.com. The funny thing is, it is NOT people searching for something and landing on my site; it appears that Bing itself is doing keyword searches. For example, here is a list of words it has looked for so far this month:

about
adelphia
adobe
airways
cameras
cdrecord
channel
channels
cisco
citrix
clear
client
cloning
comcast
cooking
december
demand
desktop
digital
drive
drives
dtrace
dynamic
dyson
early
error
family
funny
fusion
gravelly+point
hospitals
house
january
jetblue
morgantown
mount
movies
mysql
nikon
october
opensolaris
overcurrent
passwd
password
patch
peanut
photography
photoshop
pictures
ponytail
postgresql
procmail
psrinfo
question
radio
random
replication
restoration
sendmail
server
service
should
solaris
studio
surprise
system
syswatch
table
thomas
tivoli
today
toilet
tomcat
transition
travel
trying
tvgos
update
upgrade
usairways
vacation
vmware
vsphere
weblog

There are two reasons I can tell it is not a person: first, the requests are coming from a bot, and second, when someone actually uses Bing to search for something there is additional stuff “left” on the referrer string. Is Bing really that stupid about how it indexes a site? So I guess the point of this is: if you want your site at the top of the list on Bing, just put a dictionary on it ;-).
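
For what it is worth, pulling those keywords back out of the logs can be done with a one-liner along these lines; this assumes an Apache combined-format access log (referrer in the fourth quote-delimited field) and that the search term rides in a q= parameter on the bing.com referrer:

# Grab the bing.com referrers, pull out the q= term, and count how often
# each one shows up.
awk -F'"' '$4 ~ /bing\.com\/search/ {print $4}' access_log |
  sed -n 's/.*[?&]q=\([^&]*\).*/\1/p' |
  sort | uniq -c | sort -rn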

KDKA please fix your TVGOS signal

Found out that the problem with TVGOS on one of my TVs is that KDKA doesn’t seem to “know” about Morgantown’s zip codes. So I sent them this tonight; we’ll see what happens, or if I even get a response:

Hello,
Currently I have a TV that uses Macrovision’s (Rovi’s) TVGOS system to display the TV Guide program guide. When the signal was coming from the analog WNPB PBS station in Morgantown, WV, everything seemed fine. However, now that I get the signal from your station on Comcast Cable in Morgantown, the TV can never find the listings for Morgantown. I found that if I put in the zip code as 15222, it will display the Comcast info for Pittsburgh, but that information is not correct for Comcast in Morgantown. Is there any way you can add Morgantown’s zip codes and channel info to whatever transmits that data? Specifically 26505 or 26501.

Thanks

Mediatomb and Solaris 10

Now that I have rebuilt the server, it was time to put MediaTomb on it to share media with the PS3. To get it to compile on Solaris (a fresh 05/09 Update 7 install patched with the latest security and recommended patches), there are a couple of things you have to do. Most of this is from http://blogs.sun.com/constantin/entry/mediatomb_on_solaris, with some additional steps I had to do:

1. Download the latest version of the “file” program from ftp://ftp.astron.com/pub/file/ (in my case the current version is 5.03)
2. Unzip/untar the file
3. Configure and run make:

gzip -d file-5.03.tar.gz
tar -xvf file-5.03.tar
cd file-5.03
./configure --prefix=/usr/local/file
gmake
su - root -c "gmake install"
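
Before moving on, it does not hurt to sanity-check that the new file(1) landed where MediaTomb’s configure will later be pointed; the paths below just follow from the --prefix used above:

# -v prints the version and the magic file it was compiled to use
/usr/local/file/bin/file -v
ls /usr/local/file/include/magic.h /usr/local/file/lib/libmagic*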

4. There are a bunch of other requirements for MediaTomb; the easiest way to get them is to use www.blastwave.org (see the sketch after this list for one way to pull them in). The packages that I installed are:
CSWbdb4
CSWbzip2
CSWcurl
CSWcurlrt
CSWexpat
CSWfaac
CSWfaad2
CSWfconfig
CSWffmpeg
CSWffmpeglib
CSWftype2
CSWgcc3corert
CSWgcc3g++rt
CSWgcrypt
CSWggettext
CSWgpgerr
CSWiconv
CSWid3lib
CSWimlib2
CSWisaexec
CSWlame
CSWliba52
CSWlibid3tag
CSWlibidn
CSWlibnet
CSWlibogg
CSWlibsdl
CSWlibssh2
CSWlibtool
CSWlibtoolrt
CSWlibx11
CSWlibxau
CSWlibxcb
CSWlibxdmcp
CSWncurses
CSWoldaprt
CSWossl
CSWossldevel
CSWosslrt
CSWosslutils
CSWpixman
CSWpng
CSWsasl
CSWsdlmixer
CSWsqlite3
CSWsqlite3dev
CSWstl4
CSWsunmath
CSWtaglibgcc
CSWtheora
CSWtiff
CSWungif
CSWvorbis
CSWxvid
CSWzlib
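
For reference, this is roughly how packages get pulled down with Blastwave’s pkg-get, if I remember the flags right; the bootstrap URL is from memory and may have changed, and you would repeat the install line for everything in the list above:

# One-time bootstrap of pkg-get itself (URL from memory, may have changed):
# pkgadd -d http://www.blastwave.org/pkg_get.pkg

# Then install each CSW package; pkg-get also drags in Blastwave dependencies.
pkg-get -i CSWsqlite3
pkg-get -i CSWsqlite3dev
pkg-get -i CSWexpat
pkg-get -i CSWtaglibgcc
pkg-get -i CSWcurl
pkg-get -i CSWffmpeg
# ...and so on for the rest of the list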

5. Once these are installed, you can download MediaTomb from http://mediatomb.cc/pages/download and compile it:

gzip -d mediatomb-0.11.0.tar.gz
tar -xvf mediatomb-0.11.0.tar
cd mediatomb-0.11.0
./configure --prefix=/mediatomb --enable-iconv-lib --with-iconv-h=/opt/csw/include --with-iconv-libs=/opt/csw/lib --enable-libmagic --with-magic-h=/usr/local/file/include --with-magic-libs=/usr/local/file/lib --with-taglib-cfg=/opt/csw/bin/taglib-config --with-curl-cfg=/opt/csw/bin/curl-config --with-sqlite3-libs=/opt/csw/lib --with-sqlite3-h=/opt/csw/include --with-search=/opt/csw --with-id3lib-h=/opt/csw/include --with-id3-libs=/opt/csw/lib
gmake

However, before you can run gmake you need to edit a couple of files. The first is src/main.cc: comment out lines 128 through 141, which are not needed on Solaris. The second edit fixes a compile error that looked like this:

../src/url.cc:78:53: macro "curl_easy_setopt" requires 3 arguments, but only 2 given
../src/url.cc: In member function `zmm::Ref<zmm::StringBuffer> URL::download(zmm::String, long int*, CURL*, bool, bool, bool)':
../src/url.cc:78: warning: statement is a reference, not call, to function `curl_easy_setopt'

To fix it, edit src/url.cc and change line 78 from this:

curl_easy_setopt(curl_handle, CURLOPT_NOBODY);

to this:

curl_easy_setopt(curl_handle, CURLOPT_NOBODY, 1);

Then rerun gmake. Once the compile is finished, su to root and do a gmake install; it will place all of the MediaTomb files in /mediatomb. (I am using a zone on a Solaris 10 machine, so / has plenty of space.)

I then created a user for MediaTomb to run under: a user and group called mediatmb were created, and ownership of all the directories and files under /mediatomb was changed to mediatmb, as sketched below.
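
Something along these lines does it (run as root); the home directory and shell are arbitrary choices, not anything MediaTomb requires:

# Create the group and user, then hand them the install tree
groupadd mediatmb
useradd -g mediatmb -d /mediatomb -s /bin/sh mediatmb
chown -R mediatmb:mediatmb /mediatomb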

Once that was done, log in as the mediatmb user and create a script in /mediatomb/bin with the following in it:

LD_LIBRARY_PATH=/opt/csw/lib:/usr/local/file/lib:/usr/sfw/lib
export LD_LIBRARY_PATH
./mediatomb --ip x.x.x.x --port 49194 --daemon --pidfile /tmp/mediatomb.pid --logfile=/tmp/mediatomb.log

where x.x.x.x is the IP address of the machine you are running it on. In Constantin’s blog he mentions using the interface name instead, but I found that had problems since this is a zone, so I used the zone’s IP address rather than the interface.
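
Because the script calls ./mediatomb with a relative path, it has to be run from /mediatomb/bin, where the binary was installed. A quick way to start it and confirm it is up might look like this (start-mediatomb.sh is just a placeholder name for the script; the port and log path come from the script above):

cd /mediatomb/bin
chmod +x start-mediatomb.sh    # placeholder name for the script above
./start-mediatomb.sh
netstat -an | grep 49194       # the daemon should be listening on the port
tail /tmp/mediatomb.log        # and the log should show it starting up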

The changes I made to the config.xml in the ~/.mediatomb directory are as follows:

-bash-3.00$ diff orig-config.xml config.xml 
6a7
>       <account user="unixwiz" password="video"/>
23c24
< <protocolInfo extend="no"/><!-- For PS3 support change to "yes" -->
---
>     <protocolInfo extend="yes"/><!-- For PS3 support change to "yes" -->
44a46
>     <magic-file>/usr/local/file/share/misc/magic.mgc</magic-file>
46a49,50
>       <map from="mpg" to="video/mpeg"/>
>       <map from="JPG" to="image/jpeg"/>
61c65
< <!-- <map from="avi" to="video/divx"/> -->
---
>         <map from="avi" to="video/divx"/>

Now all you have to do is log in to the web interface and add the media, following MediaTomb’s documentation.

Some interesting things I have found with it:

1. If your PS3 is on wireless it may have problems streaming MP4, but not MPEG-2. Weird, I know, as the MPEG-2 streamed solidly at 370 kb/s, but it couldn’t handle the MP4. Switching to a hardwired connection fixed that problem.

2. If you happen to be a ReplayTV user [because TiVo is a wannabe Replay 😉], you can use MediaTomb in conjunction with DVArchive, which also runs nicely in my Solaris zone. Just point a media directory at the Local_Guide directory and MediaTomb will stream every MPEG-2 file in that directory to your PS3, which is pretty damn cool.

3. The PS3 is very picky about its MP4 files. Some that I made with HandBrake don’t work, but ones I did with ffmpegX worked once I put the hardwired connection in.

4. I have not tried any of the transcoding stuff. I would rather transcode ahead of time and not bog down my server doing it on the fly.

5. I need to do a lot of reading on how to create folders and such, so that my collection is organized and not just dumped under one directory.

For those interested, my compile environment is set up like this:

PATH=/bin:/sbin:/usr/bin:/usr/sbin:/usr/sfw/bin:/usr/ccs/bin:/usr/local/bin:/opt/csw/bin
SHELL=/bin/tcsh

Fixing Dynamic DNS

My Solaris machine that ran DHCP/DNS and routing for my home network died tonight, after running non-stop for over three and a half years. So I had to re-set up my DHCP and DNS on another machine. Luckily I had backed up the stuff that was on the old machine a month or so ago, but some info had changed, in particular the Dynamic DNS I had set up and linked with the DHCP server (I use ISC’s DHCP and DNS servers). I got the backup restored on another server and everything running, but a couple of hosts would not work. It turns out the backup I had was actually several months old (not a big problem, the machine did not change that much), but what had changed was my IP address to the world (it changed some time in March or April after having been the same for over three years).

Well, I had forgotten how to update the Dynamic DNS records, so I had to go hunting. This is what I did:

1. You can update the records dynamically using nsupdate (if you have it configured to allow that, which I did). So I did the following:


#nsupdate
server 10.0.0.69
key dhcpupdate u23ove098uy2ok3n12339==
zone homenetwork.net
update delete homenetwork.net
send
update add homenetwork.net 18000 IN A 10.0.0.1
send
^D

Now that part worked, but I noticed that at some point I had screwed up one of the NS records (it had the IP address tacked onto the domain). So again, to delete it and add a correct NS record:


#nsupdate
server 10.0.0.69
key dhcpupdate u23ove098uy2ok3n12339==
update delete homenetwork.net. NS 10.0.0.69.homenetwork.net.
send
update add homenetwork.net. 86400 IN NS ns.homenetwork.net.
send
^D
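
To double-check that both updates actually took, the easy way is to query the server directly with dig; the server address and names below are just the same examples as above:

# Ask the DNS server itself for the records that were just touched
dig @10.0.0.69 homenetwork.net A +short
dig @10.0.0.69 homenetwork.net NS +short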

That is all fine and well, but I am used to editing the zone files by hand… I didn’t realize until tonight that I can actually still do that. Anyone who has used DDNS with ISC’s BIND will notice that the zones directory contains files with a .jnl extension for the zones that have dynamic DNS enabled. Those files are binary, so viewing them looks very weird. For some reason I always thought those journal files were “it”, and that the files I used to edit were no longer used. But they are: the old zone files are updated about every 15 minutes with the info that is in the .jnl files. And if you want to edit the zone files the way you always have, while still using DDNS, you can. All you need to do is “freeze” updates, edit the files, and then “thaw” the zones. Freezing a zone flushes the info from the .jnl files into the zone files you are used to editing. So all you need to do is the following:


rndc freeze

Edit the files, then:

rndc thaw

Your changes will now be available.
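
One extra note: rndc will also take a zone name, so you can freeze and thaw a single zone instead of the whole server. Just remember to bump the SOA serial before thawing so secondaries notice the hand edit; the zone name below is the example from above and the file path is an assumption:

rndc freeze homenetwork.net
# edit the zone file by hand and increment the SOA serial, for example:
vi /var/named/homenetwork.net.db    # path is an assumption; use your zone file
rndc thaw homenetwork.net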