the Jovian Skye

jsTimeMachine - Web Based "Time Machine" Interface · 1 April 2008 by Julian

OK, it’s very beta, but it sort of works ;-) – no really, it does


So the story goes: Kris, one of my workmates, is working on a web based application where he needed to show 2D data over time, & being the Mac Fan Boy Enthusiast that he is, he thought: why not have some sort of 3 dimensional transition of the data off into infinity???

And the rest, as they say, is history. I’ve made a nice pretty background, got some sort of preview going, tidied up the interface a bit & posted it here (cause Kris has no blog, not that I look after mine all that well :P)

We’d only consider this a 0.5b release (yeah, very beta). Kris has yet to work on the scaling of the contents of the panels, so if a panel’s content is %-based it scales fine as it zooms towards you (except text), but if there’s an image in there (as you’ll see in the demo) then no scaling occurs :(

Demo One – jsTimeMachineTable [Table based]

Demo Two – jsTimeMachinePix [Picture Gallery]


Comment [1]

Seriously Kewl Eye Candy · 30 May 2006 by Julian

Websites as graphs

So do you really want to know what your site’s code really looks like? Well, some clever dickie named Aharef has made something to show you what your (X)HTML code actually looks like.

The Jovian Skye in code view is a rather interesting view of how this web page (or any page) actually looks. For the dyslexics amongst us this is often how we are able to view things.

Hacking Sphider to Weigh the Heading Tags · 19 April 2006 by Julian

These are the instructions on how to add heading weightings (for the h1 to h6 tags found in web pages, just like Google does) to the Sphider Search Engine.

Once installed, this will allow you to apply a weighting to the various heading tags found in web pages when they are indexed. (cause you’re using the heading tags, right?!? ;-))

Warning! Not for Amateurs

Before you begin make sure you take the necessary precautions and backup your files, & when you break it, remember that it’s not my fault; you’re the one to blame.

O.K. here we go…

Replace the following functions:

in spiderfuncs.php

& in spider.php

Add the following to the following files:

settings/config.php => at the very end:
// Relative weight of a word in heading tags
// ............................................................... added by Julz
$heading_weights = array ('h1' => 9, 'h2' => 8, 'h3' => 7, 'h4' => 6, 'h5' => 4, 'h6' => 2);
// ............................................................... added by Julz
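The actual function changes in spiderfuncs.php & spider.php aren’t reproduced here, but the idea behind the $heading_weights array is simple enough to sketch. The helper below is hypothetical, not Sphider’s real code: it just maps the tag a word was found in to its weight multiplier.

```php
<?php
// Hypothetical sketch only -- Sphider's real changes live in the
// (elided) functions in spiderfuncs.php / spider.php.
$heading_weights = array('h1' => 9, 'h2' => 8, 'h3' => 7,
                         'h4' => 6, 'h5' => 4, 'h6' => 2);

// Return the weight multiplier for a word, given the tag it was found in.
function heading_weight($tag, $weights)
{
    // Words outside h1-h6 keep the normal weight of 1
    return isset($weights[$tag]) ? $weights[$tag] : 1;
}

echo heading_weight('h1', $heading_weights); // 9
echo heading_weight('p',  $heading_weights); // 1
```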

admin/configset.php => inside the “if (isset($Submit))” right after the other weights settings:
// ............................................................... added by Julz
fwrite($fhandle, "\n\n// Relative weight of a word in heading tags\n");
fwrite($fhandle,"$"."heading_weights = array ('h1' => " . $_h1_weight . ", 'h2' => " . $_h2_weight . ", 'h3' => " . $_h3_weight . ", 'h4' => " . $_h4_weight . ", 'h5' => " . $_h5_weight . ", 'h6' => " . $_h6_weight . ");");
// ............................................................... added by Julz

admin/configset.php => just before the submit button:
<!-- // ................................. added by Julz -->
<?php for ($i = 1; $i <= 6; $i++) { ?>
<tr>
<td class="left1"><input name="_<?php echo 'h' . $i;?>_weight" type="text" id="<?php echo 'h' . $i;?>_weight" size="5" maxlength="2" value="<?php echo $heading_weights['h' . $i];?>"></td>
<td> Relative weight of a word in <?php echo 'h' . $i;?> tags</td>
</tr>
<?php } ?>
<!-- // ................................. added by Julz -->

MySQL Log Rotation with PHP · 14 February 2006 by Julian

If you want to rotate your MySQL log files (i.e. you have enabled the --log-bin option on the command line or used MySQL Administrator to enable Binary Log Files) via PHP, here’s a little script to enable you to do it.

Using the PEAR DB libraries, here’s how you would do it:

//---- this is the time of the last log file that will be kept
$purgeTime['hour'] = 23;
$purgeTime['minute'] = 59;
$purgeTime['second'] = 50;

//---- date adjustment for the purging of the log files,
//---- in this case it is set to 2 because I want to keep
//---- 24 hours worth & my purge will take place after
//---- midnight
$purgeDateAdjustment['days'] = 2;
$purgeDateAdjustment['months'] = 0;
$purgeDateAdjustment['years'] = 0;

//---- include the PEAR DB classes
require_once 'DB.php';
//---- include the DB parameters (sets up the $db connection)

if (DB::isError($db)) { die($db->getMessage()); }

$purgeEvent = date("Y-m-d H:i:s",mktime($purgeTime['hour'], $purgeTime['minute'], $purgeTime['second'], date("m")-$purgeDateAdjustment['months'], date("d")-$purgeDateAdjustment['days'], date("Y")-$purgeDateAdjustment['years']));

$queryPurge = "PURGE BINARY LOGS BEFORE '$purgeEvent'";

$result = $db->query($queryPurge);

echo $queryPurge . "<br />\n";

//---- fetch a resultset of all the binary logs; this part is
//---- not necessary but shows you what you have done
$queryShowMaster = 'SHOW MASTER LOGS';

$result = $db->query($queryShowMaster);

if (DB::isError($result)) { die(showErrorPage($result->getMessage())); }

$numrows = $result->numRows();
$counter = 1;

for ($i = 0; $i < $numrows; $i++) {
    $row = $result->fetchRow(DB_FETCHMODE_ASSOC, $i);
    if ($counter == 1) {
        $oldest_kept_log = $row['Log_name'];
    }
    $counter++;
}

$reportMessage = "The MySQL binary logs have been rotated. The oldest log is '$oldest_kept_log'";
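To sanity-check the date arithmetic in the script, here’s the same mktime() call pinned to a made-up “current” date (just after midnight on 14 Feb 2006), which keeps roughly two days of logs:

```php
<?php
// Worked example of the purge-cutoff arithmetic, pinned to a
// hypothetical "current" date so the result is repeatable.
date_default_timezone_set('UTC'); // pin the timezone for repeatability

$purgeTime = array('hour' => 23, 'minute' => 59, 'second' => 50);
$purgeDateAdjustment = array('days' => 2, 'months' => 0, 'years' => 0);

$now = mktime(0, 30, 0, 2, 14, 2006); // just after midnight, 2006-02-14

$purgeEvent = date('Y-m-d H:i:s', mktime(
    $purgeTime['hour'], $purgeTime['minute'], $purgeTime['second'],
    date('m', $now) - $purgeDateAdjustment['months'],
    date('d', $now) - $purgeDateAdjustment['days'],
    date('Y', $now) - $purgeDateAdjustment['years']
));

echo $purgeEvent; // 2006-02-12 23:59:50
```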

See also:
MySQL Database Checks with PHP

MySQL Database Checks with PHP · 7 February 2006 by Julian

I’m just in the middle of upgrading our web servers (Windows 2003 Enterprise Edition) at the place I work (can’t tell you where sorry, they don’t like blogging – yet ;-) ), & I was wanting to run an automated check of the MySQL database.

I considered running them directly via a Cron type service (prycron is a good one for Windows) but that’s far too manual for the number of databases we have & lacks the control I wanted. So I looked around the web for some good scripts, but nothing out there was what I wanted, so I turned to the good old MySQL site & had a nosey around.

Scraping together a few snippets of code from various contributors from the comments section I managed to build a little script to check the MySQL databases.

Essentially all MySQL checks, analyzes, optimizes, repairs, dumps & log rotations can be performed by simply executing a SQL command. So the idea is just to make up a script in PHP that executes your required SQL statements.


To perform a CHECK of a MySQL table the syntax is;
CHECK TABLE tbl_name [, tbl_name] ... [option] ...

Simple eh?
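Since the whole trick is just string-building SQL, a tiny helper shows the shape of it (this helper & the table names are made up for illustration, not part of the script below):

```php
<?php
// Build a CHECK TABLE statement from a list of tables, with an
// optional check type (CHANGED, EXTENDED, etc.).
function build_check_query(array $tables, $option = '')
{
    $sql = 'CHECK TABLE ' . implode(', ', $tables);
    if ($option != '') {
        $sql .= ' ' . $option;
    }
    return $sql;
}

echo build_check_query(array('users', 'sessions'), 'CHANGED');
// CHECK TABLE users, sessions CHANGED
```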

I chose to use CHANGED when performing a check on the database during the day for any tables that were updated frequently, & did an EXTENDED check on all tables at 4am when I did my full backup. [Of note: I found that the hour from 4am until 5am was the quietest, so that’s when I do my complete database backups & extended checks etc. on the databases.]

As well as CHECK I also perform ANALYZE & OPTIMIZE at the same time. ANALYZE TABLE analyzes and stores the key distribution for a table. OPTIMIZE TABLE should be used if you have deleted a large part of a table or if you have made many changes to a table with variable-length rows (tables that have VARCHAR, VARBINARY, BLOB, or TEXT columns).

ANALYZE & OPTIMIZE have the NO_WRITE_TO_BINLOG option, which is useful in that it won’t bloat your bin log files with all these queries.

The following uses the PEAR DB libraries as any good PHP programmer would ;)

Please Note: Textpattern (my blog software) kept turning ”==” into centred text, so if you see ”= =” anywhere in this script, read it as ”==”.

$sqlTablesToCheck is an array of database configuration files, for the 20+ databases that you have ;-)
$sqlTablesToCheck["druple"] = 'path_to_config_file/druple_db-prams.php';

foreach ($sqlTablesToCheck as $tableName => $configFile) {
  // ---- includes the DB parameters

  echo "<h2>$tableName</h2>\n";

  if (DB::isError($db)) { die($db->getMessage()); }

  // ---- fetch a list of all tables in the DB
  $sqlAllTables = 'SHOW TABLES';

  $result = $db->query($sqlAllTables);

  if (DB::isError($result)) { die($result->getMessage()); }

  $numberRows = $result->numRows();

  echo "<h3>$sqlAllTables</h3>\n";

  for ($i = 0; $i < $numberRows; $i++) {
    $row = $result->fetchRow(DB_FETCHMODE_ORDERED, $i);
    echo $row[0] . ", ";
    $theTablesArr[] = $row[0];
  }

  $theTables = implode(',', $theTablesArr);

  // ---- build a list of SQL statements to do the CHECK/ANALYZE/OPTIMIZE
  // ---- perform the full "EXTENDED" check of the database only at 4am!
  $currentTime['hour'] = date('G');
  if ($currentTime['hour'] == 4) {
    $sqlQuerys["CHECK"] = 'CHECK TABLE ' . $theTables . ' EXTENDED';
  } else {
    $sqlQuerys["CHECK"] = 'CHECK TABLE ' . $theTables . ' CHANGED';
  }

  $sqlQuerys["ANALYZE"] = 'ANALYZE NO_WRITE_TO_BINLOG TABLE ' . $theTables;
  $sqlQuerys["OPTIMIZE"] = 'OPTIMIZE NO_WRITE_TO_BINLOG TABLE ' . $theTables;

  // ---- perform the CHECK/ANALYZE/OPTIMIZE on the DB
  foreach ($sqlQuerys as $queryName => $eachQuery) {
    // ---- wait for 4 seconds, don't want to kill the poor server with lots of heavy queries
    sleep(4);

    $result = $db->query($eachQuery);

    if (DB::isError($result)) { die($result->getMessage()); }

    $numberRows = $result->numRows();

    echo "<h3>$queryName</h3>";

    for ($i = 0; $i < $numberRows; $i++) {
      $row = $result->fetchRow(DB_FETCHMODE_ASSOC, $i);
      echo "{$row['Table']} > {$row['Op']} > {$row['Msg_text']} > {$row['Msg_type']}<br />\n";

      if (($row['Msg_type'] == "error") || ($row['Msg_type'] == "warning") || ($row['Msg_type'] == "info")) {
        $cleanUpErrors[] = "{$row['Table']}   {$row['Op']}   {$row['Msg_text']}   {$row['Msg_type']}";
      }
    }
  }

  // ---- reset the array to be populated again
  unset($theTablesArr);
}

if (!isset($cleanUpErrors)) {
  $cleanUpErrors[] = "No_errors";
}
Of course $cleanUpErrors[] is an array of errors (or warnings) generated by MySQL, which you could then fire off to an email address.
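As a rough sketch of that last step (the address, subject & sample errors are made up, and mail() needs a configured mail transport to actually deliver anything):

```php
<?php
// Sketch: firing the collected errors off to an email address.
// These sample entries stand in for what the checks might collect.
$cleanUpErrors = array(
    "users   check   OK   status",
    "sessions   check   Table is marked as crashed   error",
);

$body = "MySQL maintenance report:\n\n" . implode("\n", $cleanUpErrors);

// mail() returns true if the message was accepted for delivery;
// the @ suppresses the warning on boxes with no mail transport
$sent = @mail('admin@example.com', 'MySQL check results', $body);
```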

See also:
MySQL Log Rotation with PHP

Search Engine Shootout · 9 November 2005 by Julian

I was in the market for a [simple] search engine to use on our staff intranet the other day, so being the PHP/MySQL developer (& proponent of Open Source Software) that I am, I hunted down two choices: PhpDig & Sphider.

First I tried PhpDig

So I went to the site & downloaded it. Oh yeah! I saw I could index PDFs & Word docs. (Kewl ;-) ). Installing it didn’t take long, that is, until I tried to get it to index the PDFs & Word docs.

PhpDig’s instructions seemed rather straightforward, but I was having problems getting it to index the PDFs (I thought I’d try that first, before indexing the Word docs). So RTFM, . . . no help. Then I tried to look through the forums, . . . no help, but wait . . . these forums seem to have only half the threads in them, hmm, maybe I’d better register. So I did.

Still half of the threads are missing, but wait, hang on a second, there’s . . . a . . . link. Whoa, what did I discover: the software is open source & the software is free, but wait, the help’s gonna cost yah (BURN!). Oh, did that leave a nasty taste in the mouth. Why can’t he just say it’s NOT FREE??!!?? It’s not that I wouldn’t pay for it (I do pay for some software), I’d just like to know about the cost before I’VE INSTALLED THE BLOODY THING!

So the next decision was: am I going to keep this? The search engine was pretty good: fast, easy to use & it has templates. But wait a second, only 300-something-odd pages had been indexed & this site is several thousand. OK, back to RTFM, . . . no help there. CRAP!

So I went off in search of another engine . . .

Enter Sphider

Well, pretty much, Sphider was as straightforward to install as PhpDig. The feature set wasn’t quite there yet (no Word/Excel indexing on Windows), & it was apparent that the project wasn’t quite as advanced as PhpDig (no templates, nor XHTML code; these are “in the works” supposedly).

Setup was quick, just a few things to do; one thing missing is documentation of any substance (a problem with all OSS really). But I was up and running within an hour or so. It was good to see the support for excluding areas of the page from indexing (just add some comments to your include files & you’re away).

One big difference (other than the fact I could get it to index the whole site) was the speed in which it indexed the site, quite noticeably quicker (& the feedback on the indexing was good too).

I was very impressed with the backend tools for Sphider, simple but all that you really need.

The Verdict

Not much of a hard problem picking the winner; the fastest, the most thorough & the easiest to get help on (& the one I kept) was Sphider. Although it’s still missing a few features, I’m very certain it will catch up.

I’ve had Sphider running on our Intranet web site for a few weeks now & it’s been very successful. It’s interesting to view the “Most popular searches”, & looking at the top keywords is useful to track the information that people are most interested in.