One Minute Man

Discussion of challenges you have already solved
canine
Posts: 190
Joined: Sun Sep 14, 2008 5:38 am

One Minute Man

Post by canine »

Code: Select all

#!/bin/bash

url="http://www.hacker.org/challenge/misc/minuteman.php"
temp="/home/brian/.temp_compare_page_file"
temp_md5sum="$temp.md5"

curl "$url" 2>/dev/null > $temp

if [ ! -e $temp_md5sum ]
  then
    md5sum $temp > $temp_md5sum

  else
    if ! md5sum --status -c $temp_md5sum
      then
        cp $temp /home/brian/Desktop/OHSHIT

      else
        rm $temp
    fi
fi
Basically, each time this script runs, it fetches the web page in question and compares the md5sum of the file to the stored md5sum. If they don't match, it copies the page to my desktop as OHSHIT.

After a few hours of running every minute, OHSHIT magically popped up on my desktop.

I used cron to schedule it.

Code: Select all

*  *  *  *  *  /the/path/to/the/script
MerickOWA
Posts: 182
Joined: Mon Apr 07, 2008 5:54 pm
Location: HkRkoz al KuwaiT 2019 HaCkEr 101

Post by MerickOWA »

That's similar to what I did, only I used 'diff' and 'webget' and just put a sleep in a loop.
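For reference, that approach comes out to roughly the sketch below, with wget standing in for 'webget'; the file names and the sleep interval are made up.

Code: Select all

#!/bin/bash
# Rough sketch of the wget + diff + sleep-in-a-loop idea; paths are placeholders.

url="http://www.hacker.org/challenge/misc/minuteman.php"
baseline="/tmp/minuteman_baseline.html"
current="/tmp/minuteman_current.html"

# Save the normal "back later" page once as the reference copy.
wget -q -O "$baseline" "$url"

while true; do
    wget -q -O "$current" "$url"
    if ! diff -q "$baseline" "$current" > /dev/null; then
        echo "Page changed:"
        cat "$current"
        break
    fi
    sleep 45
done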
canine
Posts: 190
Joined: Sun Sep 14, 2008 5:38 am

Post by canine »

MerickOWA wrote:That's similar to what I did, only I used 'diff' and 'webget' and just put a sleep in a loop.
TMTOWTDI
m!nus
Posts: 202
Joined: Sat Jul 28, 2007 6:49 pm
Location: Germany

Post by m!nus »

I did a PHP script that just compared the string in the HTTP response.
If it had changed, it searched for "is" and ":" and sent the word after it as the solution. It almost worked when I used it, except for the fact that I forgot to remove the space after "is"/":" :)
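In shell terms, the parsing step would look something like the sketch below; the exact layout of the answer page (a single word after a colon) is an assumption here.

Code: Select all

#!/bin/bash
# Sketch of the parsing idea: once the page no longer says "back later",
# take whatever follows the last ':' and drop the surrounding whitespace.

page=$(curl -s "http://www.hacker.org/challenge/misc/minuteman.php")

if ! echo "$page" | grep -q "back later"; then
    answer=$(echo "$page" | tr -d '\n' | sed 's/.*: *//' | awk '{print $1}')
    echo "Candidate answer: $answer"
fi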
falcon2424
Posts: 30
Joined: Mon Apr 30, 2007 9:35 pm

Post by falcon2424 »

I just had a cron job "wget filelocation" from a temp directory.

Then I just had to run ls -al in the directory and look for the file with a different file size.
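Something along these lines, with made-up paths; wget numbers duplicate downloads (.1, .2, ...), so nothing gets overwritten.

Code: Select all

#!/bin/bash
# One-time setup for the scratch directory.
mkdir -p /tmp/minuteman

# crontab entry (via crontab -e): fetch the page into the scratch directory every minute.
#   * * * * * cd /tmp/minuteman && wget -q http://www.hacker.org/challenge/misc/minuteman.php

# Afterwards, sort by size so the one different file stands out.
ls -lS /tmp/minuteman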
the_impaler
Posts: 61
Joined: Wed Apr 30, 2008 3:31 am

Post by the_impaler »

I did it in Windows - used a "for /L" loop to handle the timing, lwp-request to get the page, and piped it to "findstr /V".
beginning
Posts: 1
Joined: Mon Oct 27, 2008 12:20 am

Post by beginning »

I used Unity3D (hope that doesn't disqualify me) with this code:

Code: Select all

var url : String;
var minutes = 0;

function Start () {
	while (true) {
		minutes += 1;
		print("Refreshed. Minutes: " + minutes);
		var www = new WWW(url);
		yield www;
		if (www.data != "<html><body>\nback later") {
			print (www.data);
			break;
		}
		yield new WaitForSeconds(40);
	}
}
sigi
Posts: 37
Joined: Sun Oct 26, 2008 4:58 pm

Post by sigi »

I let the following Ruby program run overnight:

Code: Select all

require 'net/http'

text = nil

begin
  text = Net::HTTP.get_response('www.hacker.org','/challenge/misc/minuteman.php').body
  sleep 10
end while text =~ /back later/

File.open("answer.txt", "w") { |f| f.puts text }
Then I picked up the answer in the file the next day.

I'm exploiting the fact that the begin/end block is run at least once (even though the 'text' variable is still nil before the first iteration). This is worth mentioning because it wouldn't work if the while condition were attached to a single statement/expression.
eike42
Posts: 15
Joined: Sun Oct 26, 2008 1:17 pm

Post by eike42 »

At what time was yours? Mine was about 5am :lol:
simon
Posts: 6
Joined: Sun Nov 16, 2008 4:43 pm

Post by simon »

Here's my answer in sh. At least no files are needed...

Code: Select all

answer="back later" ; while [ "$answer" == "back later" ] ; do answer=`wget http://www.hacker.org/challenge/misc/minuteman.php -O- 2> /dev/null | sed 's|<.*>||g' | sed '/^$/d'` ; echo -n "." ; sleep 45 ; done ; echo "Answer: $answer"
macdachs
Posts: 6
Joined: Tue Oct 28, 2008 6:54 pm

Just Batch

Post by macdachs »

Code: Select all

:start
wget http://www.hacker.org/challenge/misc/minuteman.php
wait 10
goto start

Wget renames the files automatically. After 24 hours I sorted the files by size: 4.52h
aurora
Posts: 54
Joined: Thu Feb 05, 2009 12:31 pm
Location: Bavaria, Germany

Post by aurora »

This is my PHP script; I ran it on one of my servers using nohup. In the meantime I logged out and went to sleep.

Code: Select all

#!/usr/bin/env php
<?php

set_time_limit(0);

do {
    sleep(30);

    $c = file_get_contents('http://www.hacker.org/challenge/misc/minuteman.php');
} while(strpos($c, 'back later') !== false);

print $c;

?>
Belriel
Posts: 16
Joined: Sat Dec 20, 2008 2:55 pm

Post by Belriel »

Mine was almost the same as aurora's. Running it with PHP on the command line, after some time I had the answer written to the console. But I needed several days, because the script stopped execution a few times due to network timeouts.

Code: Select all

<?php
$url = "http://www.hacker.org/challenge/misc/minuteman.php";
$text = file_get_contents($url);
// strict comparison: strpos() returns false only when "back later" is absent
while(strpos($text,"back later") !== false){
	$file = @file_get_contents($url);
	if($file !== false && $file !== '') $text = $file;   // ignore failed or empty fetches
	sleep(30);
}
echo $text;
?>
rajahten
Posts: 2
Joined: Wed Feb 04, 2009 1:31 pm

Post by rajahten »

I used this Python script, which I made.
Guess I could've made it with less code, but I'm still learning :D

Code: Select all

import urllib
import time

proxy = {'http': 'proxyurl'}
opener = urllib.FancyURLopener(proxy)
finished = False
compare = 'back later'

while not finished:
	f = opener.open("http://www.hacker.org/challenge/misc/minuteman.php")
	string = f.read()
	s = string[13:]  # skip the leading "<html><body>\n" (13 characters)
	if s != compare:
		print 'content displayed at ' + time.ctime(time.time()) + ' is: ' + s
		file = open('tempdump.txt', 'w')
		file.write('the content displayed at ' + time.ctime(time.time()) + ' is: ' + s);
		file.close()
		finished = True
	else:
		time.sleep(58)
masgo
Posts: 2
Joined: Sat Nov 29, 2008 3:29 pm

Post by masgo »

I let cron start a bash script every minute.

The script got the page with wget and then did a diff against the default "come back later" page. If the diff wasn't empty, the output of diff was written to a file.
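A minimal sketch of such a script, assuming the default page has been saved once beforehand; the paths and file names here are made up.

Code: Select all

#!/bin/bash
# Meant to be run from cron once a minute: compare the live page against a saved
# copy of the default "come back later" page and log any difference.

url="http://www.hacker.org/challenge/misc/minuteman.php"
default="$HOME/minuteman_default.html"     # saved copy of the normal page
current="/tmp/minuteman_current.html"

wget -q -O "$current" "$url"

# diff exits non-zero when the files differ; keep whatever changed.
if ! diff "$default" "$current" > /tmp/minuteman.diff; then
    cat /tmp/minuteman.diff >> "$HOME/minuteman_answer.txt"
fi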