No_commercial!
Crackers against Smut

Fravia's Source checking page

(How to exploit "unconnected" data - user authentication on the net)
~
By Fravia, Updated November 1997
Page severely under construction...


Well, inside every page there are treasures of hidden information.
Use an agora server with the deep command or a good ftp-mailer, like mogli (w3mail@mogli.gmd.de) with something like
get -l -a -img http://...
And you'll immediately see, without any effort, ALL hidden links existing on that same page.
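For instance (just a sketch: the exact command syntax differs from mailer to mailer, and the URL below is of course only a placeholder), a request mail to w3mail@mogli.gmd.de could carry in its body nothing more than
  get -l -a -img http://www.somesmutsite.com/index.html
and the server will mail the result back to you, ready to be "snooped" off line.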
You can also find very useful snippets through the simple "View document source" option of Netscape... if a page uses frames, just "spring off" them: look at the source code that called those frames, in order to call each frame directly yourself without the "frame jacket" (you may work on this personally on line or, better, per email, using a good agora server).
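To give a concrete (hypothetical) example of what you'll see inside such a "frame jacket": the calling page is nothing more than a FRAMESET, and the SRC attributes inside it are the real pages you want to request directly (the filenames here are invented, of course):
  <FRAMESET COLS="20%,80%">
    <FRAME SRC="menu.html">
    <FRAME SRC="members/hidden_main.html">
  </FRAMESET>
Point your browser (or your agora server) straight at members/hidden_main.html and the "jacket" is gone.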
Since the majority of the smut depots do call "hidden" pages, you just need to perform a deep "snooping" of their "allowed" pages in order to understand where else they could link to... most of the time they will really hide some valuable information, and you'll be able to exploit the "weak" side of their CGI-scripts, in order to destroy them, as explained inside my "CGI-Script reverse engineering" pages one and two

Another useful trick is to have a look at all the COUNTERS of the site you are attacking: smut dealer sites use counters heavily (in order to be listed on the various "top reviews", see the "combing" page) and counters use CGI-scripts! Some counter sites also keep track of ALL related sites of a given counter, for instance ClaimItFor, and this means that, working "backwards", you can find totally unrelated servers that carry information pertaining to the same site!

You may also try an old "ftp" trick: mount a directory backwards (simply delete from the URL Location window of your browser the last subdirectory, or the last two, and see if that pays!). If you "come through" to the arborescence you'll have many chances to find somewhere a perl or CGI-script call that you can reverse to destroy the site... or at least you'll be able to send the whole arborescence to the newsgroups of the suckers (various alt.erotik.etcetera), laming the site out of commerce for a while, until the site owner restructures the whole subdirectory naming conventions :-)
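A quick (made-up) example of this backwards mounting: if a "free preview" picture sits at
  http://www.somesmutsite.com/pics/preview/set01/girl01.jpg
try requesting
  http://www.somesmutsite.com/pics/preview/set01/
  http://www.somesmutsite.com/pics/preview/
  http://www.somesmutsite.com/pics/
If the server has indexing enabled for any of those directories, you'll get the whole arborescence served on a plate.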

Remember that publishing (if you intercept or reconstruct them) some access passwords of the site on Usenet WILL NOT DAMAGE the smut site, because all these sites automatically filter any login: if two persons log in at the same time with the same login_ID and password (which inevitably happens if you get only some passwords attacking the site and then publish them on Usenet) the corresponding accounts will be disabled.
But if you get THE WHOLE LOT (say, piping a metacommand) and publish it on the newsgroups of the porno addicts, you'll probably push the whole smut site out of commerce, because all the "legitimate" suckers will make quite a big fuss with the smut site owner about not being able to leech their pictures inside it any more!

By all means, even if they WILL NOT work any more, for the filtering reasons explained above, have a look at the many "smut-passwords-lists" that you'll find in the "warez" scene sites, for instance here under hacking/phreaking... some sites (not many, alas) use an AUTOMATIC algorithm for the password creation, based on the Name string of the user! These are pretty easy to reconstruct, as you'll learn reading +ORC's tutorial lessons and the students' essays on my "main" site that deal with all password protection schemes (and their algorithms).
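Just to fix the ideas, here is a little C sketch of the KIND of trivial scheme you may encounter. Careful: this is NOT the algorithm of any real site, only a hypothetical reconstruction, where the password is obtained by shifting each letter of the user's name and appending the name's length:

#include <stdio.h>
#include <string.h>
#include <ctype.h>

/* Hypothetical name-based password generator: each letter of the
   user name is shifted by 3, then the length of the name is
   appended. Real sites will use their own (usually equally weak)
   variation of this kind of scheme. */
void make_password(const char *name, char *pwd, size_t max)
{
    size_t i, len = strlen(name);
    for (i = 0; i < len && i < max - 3; i++) {
        char c = (char)tolower((unsigned char)name[i]);
        pwd[i] = (c >= 'a' && c <= 'z') ? (char)('a' + (c - 'a' + 3) % 26) : c;
    }
    sprintf(pwd + i, "%u", (unsigned)len);  /* append the name length */
}

int main(void)
{
    char pwd[64];
    make_password("johnsmith", pwd, sizeof(pwd));
    printf("johnsmith -> %s\n", pwd);       /* prints: mrkqvplwk9 */
    return 0;
}

Once you have guessed (or reversed from a CGI-script) the scheme, you can regenerate valid passwords for any name you like.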

User authentication on Unix-type servers can allow or deny individuals access to the document tree directories on a username and password basis. There is NO correspondence between the system-level usernames/passwords (in /etc/passwd) and the web server's username/password file.

When users access pages that are protected with this mechanism, they are given two prompts (username and password) to which they must respond correctly before access is allowed. Once authenticated, they can navigate from page to page without repeated authentication prompts. This works because the Web browser remembers the hostname, directory path, and name/password for subsequent retrievals. To use user authentication there must be, somewhere, a private hypertext username/password file. By convention, this file is called
.htpasswd
User authentication is designed so that a user does not need an account on the system in order to be authenticated for access to files on the web server.
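By the way, such a file is nothing mysterious: each line carries a username and the crypt()-ed form of its password, separated by a colon. Something like this (the names and hashes below are of course invented):
  lamer01:iZHm6Dfq0r9Qw
  sucker77:Xm3TaW1bGJxN2
The password itself is never stored in clear, only its crypted form.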
The .htpasswd file is manipulated through a program called htpasswd.
The source of this program is called htpasswd.c and is found in the support/ subdirectory of most servers.
htpasswd is invoked as follows:
% htpasswd [-c] .htpasswd username
where username is the name of the user that should be added or edited. The -c flag, if present, tells htpasswd to create a new hypertext password file instead of editing the existing one.

If htpasswd finds the specified user, it will ask to change the user's password (which must be typed twice). htpasswd then updates the file.

Individual authentication is accomplished using a combination of access control directives and a private .htpasswd file. Let's look at an example used to restrict access to only those individuals who know the password:
The access.conf configuration file (assuming the default DocumentRoot) must look like the following:
<Directory /usr/local/etc/httpd/htdocs>
  Options Indexes FollowSymLinks
  AllowOverride None
  AuthUserFile /usr/local/etc/httpd/conf/.htpasswd
  AuthGroupFile /dev/null
  AuthName By Secret Password Only!
  AuthType Basic
  <Limit GET>
	require user username
  </Limit>
</Directory>
If you compare this to the general unrestricted configuration you'll see that:

The AuthUserFile directive is added to specify the absolute pathname to the hypertext password file (our target);

The AuthGroupFile directive has been added, but set to /dev/null, which by UNIX standards indicates that the file does not exist;

The AuthName directive is added to specify the prompt (the authorization "realm") shown to the user together with the username/password request (in this case: "By Secret Password Only!");

The AuthType directive is added and set to Basic. There is not really a choice here, since Basic is the most widely supported authorization type available.

ALL FOUR Auth directives go outside the <Limit> sectioning directive.

The order and allow directives were removed from the <Limit> sectioning directive and replaced with the require directive. This tells httpd to prompt for a username and password and that the username has to be username.
Using the require directive dictates the need to use the AuthUserFile, AuthGroupFile, AuthName and AuthType directives.

What happens next?
A hypertext password file is created for the username specified in the access.conf configuration file:
% htpasswd -c /usr/local/etc/httpd/conf/.htpasswd username
htpasswd prompts now for the password.
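What you see on the terminal is more or less the following (a sketch; the exact wording varies between server versions):
  % htpasswd -c /usr/local/etc/httpd/conf/.htpasswd username
  Adding password for username.
  New password:
  Re-type new password:
After this, the line for username (with its crypted password) sits inside .htpasswd, waiting for you.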

If individual authentication is needed on a directory-level basis, say to protect the directory /usr/local/etc/httpd/htdocs/xxxsmut, the following directives must be placed in a .htaccess file in that directory:
  AuthUserFile /usr/local/etc/httpd/conf/.htpasswd
  AuthGroupFile /dev/null
  AuthName By Secret Password Only!
  AuthType Basic
  <Limit GET>
	require user username
  </Limit>
As we have seen above.

Our smut targets use the same system, but for GROUP authentication
The users are placed into groups and then each group is treated as a whole. There are three steps in this process:
A modification of the global ACF (access.conf) file.
The creation of a hypertext group file (.htgroup) with multiple users as members.
Double-checking that ALL users in the hypertext group file are also in the hypertext password file.
Let's take a specific example:
<Directory /usr/local/etc/httpd/htdocs>
  Options Indexes FollowSymLinks
  AllowOverride None
  AuthUserFile /usr/local/etc/httpd/conf/.htpasswd
  AuthGroupFile /usr/local/etc/httpd/conf/.htgroup
  AuthName By Secret Password Only!
  AuthType Basic
  <Limit GET>
	require group groupname
  </Limit>
</Directory>
If you compare this against our starting point you'll see that:

The AuthGroupFile directive was modified from /dev/null to point to a hypertext group file called .htgroup. Usually this file is placed in the SAME directory as the hypertext password file.

The require directive was modified from user to group and the username was changed to groupname.

In a second step the file /usr/local/etc/httpd/conf/.htgroup is created, containing the following group definition:
groupname: username1 username2 username3... usernameN
Usernames are separated by spaces. Multiple groups could be identified in the hypertext group file, one per line.
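For instance, a (made-up) .htgroup with two groups could look like this:
  paying_suckers: joe mary hank
  admins: webmaster
Every name appearing here must also have its own line (and crypted password) inside the hypertext password file, otherwise the authentication will fail, as noted above.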


Web servers don't have the chroot() feature of FTP and GOPHER, wherein access is absolutely restricted to files within the data tree.
However, users can't randomly walk across the directory structure of the system either. All access is relative to the top of the document tree (as defined by the DocumentRoot directive). httpd does not let users change directories up past the top of the document tree.

However, unlike FTP and gopher (when set up properly), links from WITHIN the document tree to OUTSIDE the tree do work!
If a symbolic link has been written in an html page to an existing directory OUTSIDE the document tree (say inside a private user HTML directory), you may be able to pass through, unless the FollowSymLinks option inside access.conf has been disabled or changed to SymLinksIfOwnerMatch.

Server administrators usually set the AllowOverride directive to None and the Options directive to Indexes, preventing users from purposely including executable CGI scripts in the HTML stream or establishing symbolic links that lead out of the document tree.
The SymLinksIfOwnerMatch clause of the Options directive ensures that users can only establish symbolic links to files and directories they own, AND THESE, ON SMUT SITES, are often outside the document tree! If such links do exist and you find them, you can often escape the document tree jacket.
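A little sketch of what such an escape can look like (paths and hostname, as usual, invented): suppose a user who maintains pages inside the document tree has done
  % ln -s /home/lamer/private_stuff /usr/local/etc/httpd/htdocs/lamer/mirror
With FollowSymLinks active, requesting
  http://www.somesmutsite.com/lamer/mirror/
will happily serve you the content of /home/lamer/private_stuff (provided indexing is on, or you know a filename inside it), which lies completely OUTSIDE the document tree.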




Good luck, good hunt!

Antismut main page
A general approach
combing, i.e. how to find the "commercial smut" sites
cgi-script one: CGI-tricks, page one
cgi-script two: CGI-tricks, page two

Fravia's main site
homepage   +ORC   anon   counter measures   tools   +HCU Academy   stalking   enslavement
students' essays   cocktails   links   search_forms   mail_fravia+
Is reverse engineering legal?