Commit b7bf431f authored by jrandom's avatar jrandom Committed by zzz
[these are not the droids you are looking for]

parent 7f432122
Showing with 3123 additions and 0 deletions
Syndie is a new effort to build a user friendly secure blogging tool, exploiting the capabilities offered by anonymity and security systems such as [link schema="web" location="http://www.i2p.net/"]I2P[/link], [link schema="web" location="http://tor.eff.org/"]TOR[/link], [link schema="web" location="http://www.freenetproject.org/"]Freenet[/link], [link schema="web" location="http://www.mnetproject.org/"]MNet[/link], and others. Abstracting away the content distribution side, Syndie allows people to [b]build content and communities[/b] that span technologies rather than tying oneself down to the ups and downs of any particular network.
[cut][/cut]Syndie is working to take the technologies of the security, anonymity, and cryptography worlds and merge them with the simplicity and user focus of the blogging world. From the user's standpoint, you could perhaps view Syndie as a distributed [link schema="web" location="http://www.livejournal.com"]LiveJournal[/link], while technically Syndie is much, much simpler.
[b]How Syndie works[/b][hr][/hr]The [i]magic[/i] behind Syndie's abstraction is to ignore any content distribution issues and merely assume data moves around as necessary. Each Syndie instance runs against the filesystem, verifying and indexing blogs and offering up what it knows to the user through a web interface. The core idea in Syndie, therefore, is the [b]archive[/b] - a collection of blogs categorized and ready for consumption.
Whenever someone reads or posts to a Syndie instance, it is working with the [b]local archive[/b]. However, as Syndie's development progresses, people will be able to read [b]remote archives[/b] - pulling the archive summary from an I2P [i]eepsite[/i], TOR [i]hosted service[/i], Freenet [i]Freesite[/i], MNet [i]key[/i], or (with a little less glamor) usenet, filesharing apps, or the web. The first thing Syndie needs to use a remote archive is the archive's index - a plain text file summarizing what the archive contains ([attachment id="0"]an example[/attachment]). From that, Syndie will let the user browse through the blogs, pulling the individual blog posts into the local archive when necessary.
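Reading that plain text index takes very little code. As an illustrative sketch (the IndexHeaderSketch class and the sample keys are hypothetical - see the attached example for the actual index format), key:value header lines could be parsed like this:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.HashMap;
import java.util.Map;
import java.util.StringTokenizer;

public class IndexHeaderSketch {
    /** Collect "key:value" lines into a map, skipping anything that doesn't fit. */
    public static Map parseHeaders(BufferedReader in) throws IOException {
        Map headers = new HashMap();
        String line;
        while ((line = in.readLine()) != null) {
            StringTokenizer tok = new StringTokenizer(line, ":");
            if (tok.countTokens() == 2)
                headers.put(tok.nextToken().trim(), tok.nextToken().trim());
        }
        return headers;
    }
}
```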
[b]Posting[/b][hr][/hr]Creating and posting to blogs with Syndie is trivial - simply log in to Syndie, click on the [i]Post[/i] button, and fill out the form offered. Syndie handles all of the encryption and formatting details - packaging up the post with any attached files into a single signed, compressed, and potentially encrypted bundle, storing it in the local archive where it can be shared with other Syndie users. Every blog is identified by its public key behind the scenes, so there is no need for a central authority to require that your blogs are all named uniquely or any other such thing.
While each blog is run by a single author, they can in turn allow other authors to post to the blog while still letting readers know that the post is authorized (though created by a different author). Of course, if multiple people wanted to run a single blog and make it look like only one person wrote it, they could share the blog's private keys.
[b]Tags[/b][hr][/hr]Following the lessons from the last few years, every Syndie entry has any number of tags associated with it by the author, allowing trivial categorization and filtering.
[b]Hosting[/b][hr][/hr]While in many scenarios it is best for people to run Syndie locally on their machine, Syndie is a fully multiuser system, so anyone can be a Syndie hosting provider by simply exposing the web interface to the public. The Syndie host's operator can password protect the blog registration interface so only authorized people can create a blog, and the operator can technically go through and delete blog posts or even entire blogs from their local archive. A public Syndie host can be a general purpose blog repository, letting anyone sign up (following the blogger and geocities path); a more community oriented repository, requiring an introduction from an existing user to sign up (following the livejournal/orkut path); a more focused repository, requiring posts to stay within certain guidelines (following the indymedia path); or even a specialized collection serving particular needs, picking and choosing among the best blogs and posts out there and offering the operator's editorial flair in a comprehensive collection.
[b]Syndication[/b][hr][/hr]By itself, Syndie is a nice blogging community system, but its real strength as a tool for individual and community empowerment comes when blogs are shared. While Syndie does not aim to be a content distribution network, it does want to exploit them to allow those who require their message to get out to do so. By design, syndicating Syndie can be done with some of the most basic tools - simply pass around the self-authenticating files written to the archive and you're done. The archive itself is organized so that you can expose it as an indexed directory in some webserver and let people wget against it, choosing to pull individual posts, all posts within a blog, all posts since a given date, or all posts in all blogs. With a very small shell script, you could parse the plain text archive summary to pull posts by size and tag as well. People could offer up their archives as rsync repositories or package up tarballs/zipfiles of blogs or entries - simply grabbing them and extracting them into your local Syndie archive would instantly give you access to all of the content therein.
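The import half of that manual syndication can be sketched in a few lines, assuming the archive layout used by the code below (archiveRoot/&lt;base64 blog key hash&gt;/&lt;entryId&gt;.snd). Note that this hypothetical ImportSketch skips the signature verification that the real Archive.storeEntry performs before storing anything:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ImportSketch {
    /**
     * Drop a fetched entry into the local archive layout:
     * archiveRoot/&lt;base64 blog key hash&gt;/&lt;entryId&gt;.snd
     * The real code verifies the entry's signature against the blog's
     * metadata first; this sketch just copies the bytes.
     */
    public static File importEntry(File archiveRoot, String blogHash, long entryId,
                                   InputStream fetched) throws IOException {
        File blogDir = new File(archiveRoot, blogHash);
        blogDir.mkdirs();
        File entryFile = new File(blogDir, entryId + ".snd");
        OutputStream out = new FileOutputStream(entryFile);
        byte buf[] = new byte[4096];
        int read;
        while ((read = fetched.read(buf)) != -1)
            out.write(buf, 0, read);
        out.close();
        return entryFile;
    }
}
```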
Of course, manual syndication as described above has... limits. When appropriate, Syndie will tie in to content syndication systems such as [link schema="eep" location="http://feedspace.i2p/"]Feedspace[/link] (or even good ol' Usenet) to automatically import (and export) posts. Integration with content distribution networks like Freenet and MNet will allow the user to periodically grab a published archive index and pull down blogs as necessary. Posting archives and blogs to those networks will be done trivially as well, though they do still depend upon a polling paradigm.
[b]SML[/b][hr][/hr]Syndie is meant to work securely with any browser regardless of the browser's security. Blog entries are written in [b]SML[/b] [i](Syndie or Secure Markup Language)[/i] with a bbcode-like syntax, extended to exploit some of Syndie's capabilities and context. In addition to the SML content in a blog entry, there can be any number of attachments, references to other blogs/posts/tags, nym<->public key mappings (useful for I2P host distribution), references to archives of blogs (on eepsites, freesites, etc), links to various resources, and more.
[b]Future[/b][hr][/hr]Down the road, there are lots of things to improve with Syndie. The interface, of course, is critical, as are tools for SML authoring and improvements to SML itself to offer a more engaging user experience. Integration with a search engine like Lucene would allow full text search through entire archives, and Atom/RSS interfaces would allow trivial import and export to existing clients. Even further, blogs could be transparently encrypted, allowing only authorized users (those with the key) to read entries posted to them (or even know what attachments are included). Integration with existing blogging services (such as [link schema="web" location="http://www.anonyblog.com"]anonyblog[/link], [link schema="web" location="http://blo.gs"]blo.gs[/link], and [link schema="web" location="http://livejournal.com"]livejournal[/link]) may also be explored. Of course, bundling with I2P and other anonymity, security, and community systems will be pursued.
[b]Who/where/when/why[/b][hr][/hr]The base Syndie system was written in a few days by [blog name="jrandom" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" archive0="eep://dev.i2p/~jrandom" archive1="http://dev.i2p.net/~jrandom" archive2="mailto://jrandom@i2p.net"][/blog], though it comes out of discussions with [link schema="eep" location="http://frosk.i2p"]Frosk[/link] and many others in the I2P community. Yes, this is an incarnation of [b]MyI2P[/b] (or for those who remember jrand0m's flog, [b]Flogger[/b]).
All of the Syndie code is of course open source and released into the public domain (the [i]real[/i] "free as in freedom"), though it does use some BSD licensed cryptographic routines and an Apache licensed file upload component. Contributions of code are very much welcome - the source is located within the [link schema="web" location="http://www.i2p.net/cvs"]I2P codebase[/link]. Of course, those who cannot or choose not to contribute code are encouraged to [b]use[/b] Syndie - create a blog, create some content, read some content! For those who really want to though, financial contributions to the Syndie development effort can be channeled through the [link schema="web" location="http://www.i2p.net/donate"]I2P fund[/link] (donations for Syndie are distributed to Syndie developers from time to time).
The "why" of Syndie is a much bigger question, though it is hopefully self-evident. We need kickass anonymity-aware client applications so that we can get better anonymity (since without kickass clients, we don't have many users). We also need kickass tools for safe blogging, since there are limits to the strength offered by low latency anonymity systems like I2P and TOR - Syndie goes beyond them to offer an interface to mid and high latency anonymous systems while exploiting their capabilities for fast and efficient syndication.
Oh, and jrandom also lost his blog's private key, so needed something to blog with again.
To install this base instance:
mkdir lib
cp ../lib/i2p.jar lib/
cp ../lib/commons-el.jar lib/
cp ../lib/commons-logging.jar lib/
cp ../lib/jasper-compiler.jar lib/
cp ../lib/jasper-runtime.jar lib/
cp ../lib/javax.servlet.jar lib/
cp ../lib/jbigi.jar lib/
cp ../lib/org.mortbay.jetty.jar lib/
cp ../lib/xercesImpl.jar lib/
To run it:
sh run.sh
firefox http://localhost:7653/syndie/
You can share your archive at http://localhost:7653/ so
that people can syndicate off you via
cd archive ; wget -m -nH http://yourmachine:7653/
You may want to add a password on the registration form
so that you have control over who can create blogs via /syndie/.
To do so, set the password in the run.sh script.
Windows users:
write your own instructions. We're alpha, here ;)
[cut]A brief glance at SML[/cut]
[b]General rules[/b]
Newlines are newlines are newlines. If you include a newline in your SML, you'll get a newline in the rendered HTML.
All < and > characters are replaced by their HTML entity counterparts.
All SML tags are enclosed with [[ and ]] (e.g. [[b]]bold stuff[[/b]]). ([[ and ]] characters are quoted by [[[[ and ]]]], respectively)
Nesting SML tags is [b]not[/b] currently supported (though will be at a later date).
All SML tags must have a beginning and end tag (even for ones without any 'body', such as [[hr]][[/hr]]). This restriction may be removed later.
Simple formatting tags behave as expected: [[b]], [[i]], [[u]], [[h1]] through [[h5]], [[hr]], [[pre]].
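Those character rules are simple enough to sketch in code. This hypothetical SmlEscapeSketch covers only the character substitutions described above (entity-encoding < and >, unquoting [[ and ]]), not tag rendering:

```java
public class SmlEscapeSketch {
    /** Apply the SML character rules: HTML-entity &lt; and &gt;, unquote [[ and ]]. */
    public static String escape(String sml) {
        StringBuffer buf = new StringBuffer(sml.length());
        for (int i = 0; i < sml.length(); i++) {
            char c = sml.charAt(i);
            if (c == '<') {
                buf.append("&lt;");
            } else if (c == '>') {
                buf.append("&gt;");
            } else if ((c == '[') && (i + 1 < sml.length()) && (sml.charAt(i + 1) == '[')) {
                buf.append('[');  // [[ is a quoted literal [
                i++;
            } else if ((c == ']') && (i + 1 < sml.length()) && (sml.charAt(i + 1) == ']')) {
                buf.append(']');  // ]] is a quoted literal ]
                i++;
            } else {
                buf.append(c);    // newlines and everything else pass through
            }
        }
        return buf.toString();
    }
}
```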
[hr][/hr]
[b]Tag details[/b]
* To cut an entry so that the summary is before while the details are afterwards:
[[cut]]more inside...[[/cut]]
* To load an attachment as an image with "syndie's logo" as the alternate text:
[[img attachment="0"]]syndie's logo[[/img]]
* To add a download link to an attachment:
[[attachment id="0"]]anchor text[[/attachment]]
* To quote someone:
[[quote author="who you are quoting" location="blog://ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=/1234567890"]]stuff they said[[/quote]]
* To sample some code:
[[code location="eep://dev.i2p/cgi-bin/cvsweb.cgi/i2p/index.html"]]<html>[[/code]]
* To link to a [blog name="jrandom" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" blogentry="1124402137773" archive0="eep://dev.i2p/~jrandom/archive" archive1="irc2p://jrandom@irc.postman.i2p/#i2p"]bitchin' blog[/blog]:
[[blog name="the blog's name" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" blogtag="tag" blogentry="123456789" archive0="eep://dev.i2p/~jrandom/archive/" archive1="freenet://SSK@blah/archive//"]]description of the blog[[/blog]]. blogentry and blogtag are optional, and there can be any number of archiveN locations specified.
* To link to an [link schema="eep" location="http://dev.i2p/"]external resource[/link]:
[[link schema="eep" location="http://dev.i2p/"]]link to it[[/link]].
[i]The schema should be a network selection tool, such as "eep" for an eepsite, "tor" for a tor hidden service, "web" for a normal website, "freenet" for a freenet key, etc. The local user's Syndie configuration should include information necessary for the user to access the content referenced through the given schemas.[/i]
* To pass an [address name="dev.i2p" schema="eep" location="NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA"]addressbook entry[/address]:
[[address name="dev.i2p" schema="eep" location="NF2...AAAA"]]add it[[/address]].
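Since nesting isn't supported and every tag needs a matching end tag, pulling a tag's body out of raw SML can be sketched with plain string scanning. The hypothetical SmlTagSketch below is illustrative only (it is not Syndie's actual renderer, and it ignores the possibility of ']' inside attribute values):

```java
public class SmlTagSketch {
    /** Return the body of the first [tagName ...]body[/tagName] pair, or null. */
    public static String bodyOf(String sml, String tagName) {
        String open = "[" + tagName;
        String close = "[/" + tagName + "]";
        // make sure we matched the whole tag name ([b], not the b in [blog])
        int idx = sml.indexOf(open);
        int start = -1;
        while ((idx >= 0) && (idx + open.length() < sml.length())) {
            char next = sml.charAt(idx + open.length());
            if ((next == ']') || (next == ' ')) { start = idx; break; }
            idx = sml.indexOf(open, idx + 1);
        }
        if (start < 0) return null;
        int bodyStart = sml.indexOf(']', start);
        int end = sml.indexOf(close, bodyStart);
        if ((bodyStart < 0) || (end < 0)) return null; // SML requires an end tag
        return sml.substring(bodyStart + 1, end);
    }
}
```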
<?xml version="1.0" encoding="UTF-8"?>
<project basedir="." default="all" name="syndie">
<target name="all" depends="clean, build" />
<target name="build" depends="builddep, jar" />
<target name="builddep">
<ant dir="../../jetty/" target="build" />
<ant dir="../../../core/java/" target="build" />
<!-- ministreaming will build core -->
</target>
<target name="compile">
<mkdir dir="./build" />
<mkdir dir="./build/obj" />
<javac
srcdir="./src"
debug="true" deprecation="on" source="1.3" target="1.3"
destdir="./build/obj"
classpath="../../../core/java/build/i2p.jar:../../jetty/jettylib/org.mortbay.jetty.jar" />
</target>
<target name="jar" depends="builddep, compile">
<jar destfile="./build/syndie.jar" basedir="./build/obj" includes="**/*.class">
<manifest>
<attribute name="Main-Class" value="net.i2p.syndie.CLI" />
<attribute name="Class-Path" value="i2p.jar" />
</manifest>
</jar>
<ant target="war" />
</target>
<target name="war" depends="builddep, compile, precompilejsp">
<war destfile="../syndie.war" webxml="../jsp/web-out.xml">
<fileset dir="../jsp/" includes="**/*" excludes=".nbintdb, web.xml, web-out.xml, web-fragment.xml, **/*.java, **/*.jsp" />
<classes dir="./build/obj" />
</war>
</target>
<target name="precompilejsp">
<delete dir="../jsp/WEB-INF/" />
<delete file="../jsp/web-fragment.xml" />
<delete file="../jsp/web-out.xml" />
<mkdir dir="../jsp/WEB-INF/" />
<mkdir dir="../jsp/WEB-INF/classes" />
<!-- there are various jspc ant tasks, but they all seem a bit flakey -->
<java classname="org.apache.jasper.JspC" fork="true" >
<classpath>
<pathelement location="../../jetty/jettylib/jasper-compiler.jar" />
<pathelement location="../../jetty/jettylib/jasper-runtime.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/commons-logging.jar" />
<pathelement location="../../jetty/jettylib/commons-el.jar" />
<pathelement location="../../jetty/jettylib/org.mortbay.jetty.jar" />
<pathelement location="../../jetty/jettylib/ant.jar" />
<pathelement location="build/obj" />
<pathelement location="../../../core/java/build/i2p.jar" />
</classpath>
<arg value="-d" />
<arg value="../jsp/WEB-INF/classes" />
<arg value="-p" />
<arg value="net.i2p.syndie.jsp" />
<arg value="-webinc" />
<arg value="../jsp/web-fragment.xml" />
<arg value="-webapp" />
<arg value="../jsp/" />
</java>
<javac debug="true" deprecation="on" source="1.3" target="1.3"
destdir="../jsp/WEB-INF/classes/" srcdir="../jsp/WEB-INF/classes" includes="**/*.java" >
<classpath>
<pathelement location="../../jetty/jettylib/jasper-runtime.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/commons-logging.jar" />
<pathelement location="../../jetty/jettylib/commons-el.jar" />
<pathelement location="../../jetty/jettylib/org.mortbay.jetty.jar" />
<pathelement location="build/obj" />
<pathelement location="../../../core/java/build/i2p.jar" />
</classpath>
</javac>
<copy file="../jsp/web.xml" tofile="../jsp/web-out.xml" />
<loadfile property="jspc.web.fragment" srcfile="../jsp/web-fragment.xml" />
<replace file="../jsp/web-out.xml">
<replacefilter token="&lt;!-- precompiled servlets --&gt;" value="${jspc.web.fragment}" />
</replace>
</target>
<target name="javadoc">
<mkdir dir="./build" />
<mkdir dir="./build/javadoc" />
<javadoc
sourcepath="./src:../../../core/java/src" destdir="./build/javadoc"
packagenames="*"
use="true"
splitindex="true"
windowtitle="syndie" />
</target>
<target name="clean">
<delete dir="./build" />
</target>
<target name="cleandep" depends="clean">
<ant dir="../../../core/java/" target="distclean" />
</target>
<target name="distclean" depends="clean">
<ant dir="../../../core/java/" target="distclean" />
</target>
</project>
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import java.text.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Store blog info in the local filesystem.
*
* Entries are stored under:
* $rootDir/$h(blogKey)/$entryId.snd (the index lists them as YYYYMMDD_n_jKB)
* Blog info is stored under:
* $rootDir/$h(blogKey)/meta.snm
* Archive summary is stored under
* $rootDir/archive.txt
* Any key=value pairs in
* $rootDir/archiveHeaders.txt
* are injected into the archive.txt on regeneration.
*
* When entries are loaded for extraction/verification/etc, their contents are written to
* $cacheDir/$h(blogKey)/$entryId/ (e.g. $cacheDir/$h(blogKey)/$entryId/entry.sml)
*/
public class Archive {
private I2PAppContext _context;
private File _rootDir;
private File _cacheDir;
private Map _blogInfo;
private ArchiveIndex _index;
private EntryExtractor _extractor;
public static final String METADATA_FILE = "meta.snm";
public static final String INDEX_FILE = "archive.txt";
public static final String HEADER_FILE = "archiveHeaders.txt";
private static final FilenameFilter _entryFilenameFilter = new FilenameFilter() {
public boolean accept(File dir, String name) { return name.endsWith(".snd"); }
};
public Archive(I2PAppContext ctx, String rootDir, String cacheDir) {
_context = ctx;
_rootDir = new File(rootDir);
if (!_rootDir.exists())
_rootDir.mkdirs();
_cacheDir = new File(cacheDir);
if (!_cacheDir.exists())
_cacheDir.mkdirs();
_blogInfo = new HashMap();
_index = null;
_extractor = new EntryExtractor(ctx);
reloadInfo();
}
public void reloadInfo() {
File f[] = _rootDir.listFiles();
List info = new ArrayList();
for (int i = 0; i < f.length; i++) {
if (f[i].isDirectory()) {
File meta = new File(f[i], METADATA_FILE);
if (meta.exists()) {
BlogInfo bi = new BlogInfo();
try {
bi.load(new FileInputStream(meta));
info.add(bi);
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
}
synchronized (_blogInfo) {
_blogInfo.clear();
for (int i = 0; i < info.size(); i++) {
BlogInfo bi = (BlogInfo)info.get(i);
_blogInfo.put(bi.getKey().calculateHash(), bi);
}
}
}
public BlogInfo getBlogInfo(BlogURI uri) {
synchronized (_blogInfo) {
return (BlogInfo)_blogInfo.get(uri.getKeyHash());
}
}
public BlogInfo getBlogInfo(Hash key) {
synchronized (_blogInfo) {
return (BlogInfo)_blogInfo.get(key);
}
}
public void storeBlogInfo(BlogInfo info) {
if (!info.verify(_context)) {
System.err.println("Not storing the invalid blog " + info);
return;
}
synchronized (_blogInfo) {
_blogInfo.put(info.getKey().calculateHash(), info);
}
try {
File blogDir = new File(_rootDir, info.getKey().calculateHash().toBase64());
blogDir.mkdirs();
File blogFile = new File(blogDir, "meta.snm");
FileOutputStream out = new FileOutputStream(blogFile);
info.write(out);
out.close();
System.out.println("Blog info written to " + blogFile.getPath());
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public List listBlogs() {
synchronized (_blogInfo) {
return new ArrayList(_blogInfo.values());
}
}
private File getEntryDir(File entryFile) {
String name = entryFile.getName();
if (!name.endsWith(".snd")) throw new RuntimeException("hmm, why are we trying to get an entry dir for " + entryFile.getAbsolutePath());
String blog = entryFile.getParentFile().getName();
File blogDir = new File(_cacheDir, blog);
return new File(blogDir, name.substring(0, name.length()-4));
//return new File(entryFile.getParentFile(), "." + name.substring(0, name.length()-4));
}
/**
* Expensive operation, reading all entries within the blog and parsing out the tags.
* Whenever possible, query the index instead of the archive
*
*/
public List listTags(Hash blogKeyHash) {
List rv = new ArrayList();
BlogInfo info = getBlogInfo(blogKeyHash);
if (info == null)
return rv;
File blogDir = new File(_rootDir, Base64.encode(blogKeyHash.getData()));
File entries[] = blogDir.listFiles(_entryFilenameFilter);
for (int j = 0; j < entries.length; j++) {
try {
File entryDir = getEntryDir(entries[j]);
if (!entryDir.exists()) {
if (!extractEntry(entries[j], entryDir, info)) {
System.err.println("Entry " + entries[j].getPath() + " is not valid");
continue;
}
}
EntryContainer entry = getCachedEntry(entryDir);
String tags[] = entry.getTags();
for (int t = 0; t < tags.length; t++) {
if (!rv.contains(tags[t])) {
System.out.println("Found a new tag in cached " + entry.getURI() + ": " + tags[t]);
rv.add(tags[t]);
}
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
} // end iterating over the entries
return rv;
}
/**
* Extract the entry to the given dir, returning true if it was verified properly
*
*/
private boolean extractEntry(File entryFile, File entryDir, BlogInfo info) throws IOException {
if (!entryDir.exists())
entryDir.mkdirs();
boolean ok = _extractor.extract(entryFile, entryDir, null, info);
if (!ok) {
File files[] = entryDir.listFiles();
for (int i = 0; i < files.length; i++)
files[i].delete();
entryDir.delete();
}
return ok;
}
private EntryContainer getCachedEntry(File entryDir) {
return new CachedEntry(entryDir);
}
public EntryContainer getEntry(BlogURI uri) { return getEntry(uri, null); }
public EntryContainer getEntry(BlogURI uri, SessionKey blogKey) {
List entries = listEntries(uri, null, blogKey);
if (entries.size() > 0)
return (EntryContainer)entries.get(0);
else
return null;
}
public List listEntries(BlogURI uri, String tag, SessionKey blogKey) {
return listEntries(uri.getKeyHash(), uri.getEntryId(), tag, blogKey);
}
public List listEntries(Hash blog, long entryId, String tag, SessionKey blogKey) {
List rv = new ArrayList();
BlogInfo info = getBlogInfo(blog);
if (info == null)
return rv;
File blogDir = new File(_rootDir, blog.toBase64());
File entries[] = blogDir.listFiles(_entryFilenameFilter);
if (entries == null)
return rv;
for (int i = 0; i < entries.length; i++) {
try {
EntryContainer entry = null;
if (blogKey == null) {
// no explicit key - serve the entry from the extracted cache
File entryDir = getEntryDir(entries[i]);
if (!entryDir.exists()) {
if (!extractEntry(entries[i], entryDir, info)) {
System.err.println("Entry " + entries[i].getPath() + " is not valid");
continue;
}
}
entry = getCachedEntry(entryDir);
} else {
// we have an explicit key - no caching
entry = new EntryContainer();
entry.load(new FileInputStream(entries[i]));
boolean ok = entry.verifySignature(_context, info);
if (!ok) {
System.err.println("Keyed entry " + entries[i].getPath() + " is not valid");
continue;
}
entry.parseRawData(_context, blogKey);
entry.setCompleteSize((int)entries[i].length());
}
if (entryId >= 0) {
if (entry.getURI().getEntryId() == entryId) {
rv.add(entry);
return rv;
}
} else if (tag != null) {
String tags[] = entry.getTags();
for (int j = 0; j < tags.length; j++) {
if (tags[j].equals(tag)) {
rv.add(entry);
System.out.println("cached entry matched requested tag [" + tag + "]: " + entry.getURI());
break;
}
}
} else {
System.out.println("cached entry is ok and no id or tag was requested: " + entry.getURI());
rv.add(entry);
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
return rv;
}
public boolean storeEntry(EntryContainer container) {
BlogURI uri = container.getURI();
BlogInfo info = getBlogInfo(uri);
if (info == null) {
System.out.println("no blog metadata for the uri " + uri);
return false;
}
if (!container.verifySignature(_context, info)) {
System.out.println("Not storing the invalid blog entry at " + uri);
return false;
} else {
//System.out.println("Signature is valid: " + container.getSignature() + " for info " + info);
}
ByteArrayOutputStream baos = new ByteArrayOutputStream();
try {
container.write(baos, true);
File blogDir = new File(_rootDir, uri.getKeyHash().toBase64());
blogDir.mkdirs();
byte data[] = baos.toByteArray();
File entryFile = new File(blogDir, getEntryFilename(uri.getEntryId()));
FileOutputStream out = new FileOutputStream(entryFile);
out.write(data);
out.close();
container.setCompleteSize(data.length);
return true;
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
public static String getEntryFilename(long entryId) { return entryId + ".snd"; }
private static SimpleDateFormat _dateFmt = new SimpleDateFormat("yyyyMMdd");
public static String getIndexName(long entryId, int numBytes) {
try {
synchronized (_dateFmt) {
String yy = _dateFmt.format(new Date(entryId));
long begin = _dateFmt.parse(yy).getTime();
long n = entryId - begin;
int kb = numBytes / 1024;
return yy + '_' + n + '_' + kb + "KB";
}
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return "UNKNOWN";
} catch (ParseException pe) {
pe.printStackTrace();
return "UNKNOWN";
}
}
public static long getEntryIdFromIndexName(String entryIndexName) {
if (entryIndexName == null) return -1;
if (entryIndexName.endsWith(".snd"))
entryIndexName = entryIndexName.substring(0, entryIndexName.length() - 4);
int endYY = entryIndexName.indexOf('_');
if (endYY <= 0) return -1;
int endN = entryIndexName.indexOf('_', endYY+1);
if (endN <= 0) return -1;
String yy = entryIndexName.substring(0, endYY);
String n = entryIndexName.substring(endYY+1, endN);
try {
synchronized (_dateFmt) {
long dayBegin = _dateFmt.parse(yy).getTime();
long dayEntry = Long.parseLong(n);
return dayBegin + dayEntry;
}
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
} catch (ParseException pe) {
pe.printStackTrace();
}
return -1;
}
public static int getSizeFromIndexName(String entryIndexName) {
if (entryIndexName == null) return -1;
if (entryIndexName.endsWith(".snd"))
entryIndexName = entryIndexName.substring(0, entryIndexName.length() - 4);
int beginSize = entryIndexName.lastIndexOf('_');
if ( (beginSize <= 0) || (beginSize >= entryIndexName.length()-3) )
return -1;
try {
String sz = entryIndexName.substring(beginSize+1, entryIndexName.length()-2);
return Integer.parseInt(sz);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
}
return -1;
}
public ArchiveIndex getIndex() {
if (_index == null)
regenerateIndex();
return _index;
}
public File getArchiveDir() { return _rootDir; }
public File getIndexFile() { return new File(_rootDir, INDEX_FILE); }
public void regenerateIndex() {
reloadInfo();
_index = ArchiveIndexer.index(_context, this);
try {
PrintWriter out = new PrintWriter(new FileWriter(new File(_rootDir, INDEX_FILE)));
out.println(_index.toString());
out.flush();
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
package net.i2p.syndie;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Dig through the archive to build an index
*/
class ArchiveIndexer {
private static final int RECENT_BLOG_COUNT = 10;
private static final int RECENT_ENTRY_COUNT = 10;
public static ArchiveIndex index(I2PAppContext ctx, Archive source) {
LocalArchiveIndex rv = new LocalArchiveIndex();
rv.setGeneratedOn(ctx.clock().now());
File rootDir = source.getArchiveDir();
File headerFile = new File(rootDir, Archive.HEADER_FILE);
if (headerFile.exists()) {
try {
BufferedReader in = new BufferedReader(new InputStreamReader(new FileInputStream(headerFile)));
String line = null;
while ( (line = in.readLine()) != null) {
StringTokenizer tok = new StringTokenizer(line, ":");
if (tok.countTokens() == 2)
rv.setHeader(tok.nextToken(), tok.nextToken());
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
// things are new if we just received them in the last day
long newSince = ctx.clock().now() - 24*60*60*1000;
rv.setVersion(Version.INDEX_VERSION);
/** 0-lowestEntryId --> blog Hash */
Map blogsByAge = new TreeMap();
/** 0-entryId --> BlogURI */
Map entriesByAge = new TreeMap();
List blogs = source.listBlogs();
rv.setAllBlogs(blogs.size());
int newEntries = 0;
int allEntries = 0;
long newSize = 0;
long totalSize = 0;
int newBlogs = 0;
for (int i = 0; i < blogs.size(); i++) {
BlogInfo cur = (BlogInfo)blogs.get(i);
Hash key = cur.getKey().calculateHash();
String keyStr = Base64.encode(key.getData());
File blogDir = new File(rootDir, Base64.encode(key.getData()));
File metaFile = new File(blogDir, Archive.METADATA_FILE);
long metadate = metaFile.lastModified();
List entries = source.listEntries(key, -1, null, null);
System.out.println("Entries under " + key + ": " + entries);
/** tag name --> ordered map of entryId to EntryContainer */
Map tags = new TreeMap();
for (int j = 0; j < entries.size(); j++) {
EntryContainer entry = (EntryContainer)entries.get(j);
entriesByAge.put(new Long(0-entry.getURI().getEntryId()), entry.getURI());
allEntries++;
totalSize += entry.getCompleteSize();
String entryTags[] = entry.getTags();
for (int t = 0; t < entryTags.length; t++) {
if (!tags.containsKey(entryTags[t])) {
tags.put(entryTags[t], new TreeMap());
}
Map entriesByTag = (Map)tags.get(entryTags[t]);
entriesByTag.put(new Long(0-entry.getURI().getEntryId()), entry);
System.out.println("Entries under tag " + entryTags[t] + ":" + entriesByTag.values());
}
if (entry.getURI().getEntryId() >= newSince) {
newEntries++;
newSize += entry.getCompleteSize();
}
}
long lowestEntryId = -1;
for (Iterator iter = tags.keySet().iterator(); iter.hasNext(); ) {
String tagName = (String)iter.next();
Map tagEntries = (Map)tags.get(tagName);
long highestId = -1;
if (tagEntries.size() <= 0) break;
Long id = (Long)tagEntries.keySet().iterator().next();
highestId = 0 - id.longValue();
rv.addBlog(key, tagName, highestId);
for (Iterator entryIter = tagEntries.values().iterator(); entryIter.hasNext(); ) {
EntryContainer entry = (EntryContainer)entryIter.next();
String indexName = Archive.getIndexName(entry.getURI().getEntryId(), entry.getCompleteSize());
rv.addBlogEntry(key, tagName, indexName);
if (!entryIter.hasNext())
lowestEntryId = entry.getURI().getEntryId();
}
}
if (lowestEntryId > newSince)
newBlogs++;
blogsByAge.put(new Long(0-lowestEntryId), key);
}
rv.setAllEntries(allEntries);
rv.setNewBlogs(newBlogs);
rv.setNewEntries(newEntries);
rv.setTotalSize(totalSize);
rv.setNewSize(newSize);
int i = 0;
for (Iterator iter = blogsByAge.keySet().iterator(); iter.hasNext() && i < RECENT_BLOG_COUNT; i++) {
Long when = (Long)iter.next();
Hash key = (Hash)blogsByAge.get(when);
rv.addNewestBlog(key);
}
i = 0;
for (Iterator iter = entriesByAge.keySet().iterator(); iter.hasNext() && i < RECENT_ENTRY_COUNT; i++) {
Long when = (Long)iter.next();
BlogURI uri = (BlogURI)entriesByAge.get(when);
rv.addNewestEntry(uri);
}
return rv;
}
}
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
 * Single point of access for blog data - keys, users, the cache, and the
 * local archive itself.
 */
public class BlogManager {
private I2PAppContext _context;
private static BlogManager _instance;
private File _blogKeyDir;
private File _privKeyDir;
private File _archiveDir;
private File _userDir;
private File _cacheDir;
private Archive _archive;
static {
String rootDir = I2PAppContext.getGlobalContext().getProperty("syndie.rootDir");
if (rootDir == null)
rootDir = System.getProperty("user.home");
rootDir = rootDir + File.separatorChar + ".syndie";
_instance = new BlogManager(I2PAppContext.getGlobalContext(), rootDir);
}
public static BlogManager instance() { return _instance; }
public BlogManager(I2PAppContext ctx, String rootDir) {
_context = ctx;
File root = new File(rootDir);
root.mkdirs();
_blogKeyDir = new File(root, "blogkeys");
_privKeyDir = new File(root, "privkeys");
String archiveDir = _context.getProperty("syndie.archiveDir");
if (archiveDir != null)
_archiveDir = new File(archiveDir);
else
_archiveDir = new File(root, "archive");
_userDir = new File(root, "users");
_cacheDir = new File(root, "cache");
_blogKeyDir.mkdirs();
_privKeyDir.mkdirs();
_archiveDir.mkdirs();
_cacheDir.mkdirs();
_userDir.mkdirs();
_archive = new Archive(ctx, _archiveDir.getAbsolutePath(), _cacheDir.getAbsolutePath());
_archive.regenerateIndex();
}
public BlogInfo createBlog(String name, String description, String contactURL, String archives[]) {
return createBlog(name, null, description, contactURL, archives);
}
public BlogInfo createBlog(String name, SigningPublicKey posters[], String description, String contactURL, String archives[]) {
Object keys[] = _context.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
try {
FileOutputStream out = new FileOutputStream(new File(_privKeyDir, Base64.encode(pub.calculateHash().getData()) + ".priv"));
try {
pub.writeBytes(out);
priv.writeBytes(out);
} finally {
out.close(); // make sure the newly generated keypair actually hits the disk
}
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return null;
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
}
return createInfo(pub, priv, name, posters, description, contactURL, archives, 0);
}
public BlogInfo createInfo(SigningPublicKey pub, SigningPrivateKey priv, String name, SigningPublicKey posters[],
String description, String contactURL, String archives[], int edition) {
Properties opts = new Properties();
opts.setProperty("Name", name);
opts.setProperty("Description", description);
opts.setProperty("Edition", Integer.toString(edition));
opts.setProperty("ContactURL", contactURL);
for (int i = 0; archives != null && i < archives.length; i++)
opts.setProperty("Archive." + i, archives[i]);
BlogInfo info = new BlogInfo(pub, posters, opts);
info.sign(_context, priv);
_archive.storeBlogInfo(info);
return info;
}
public Archive getArchive() { return _archive; }
public List listMyBlogs() {
File files[] = _privKeyDir.listFiles();
List rv = new ArrayList();
for (int i = 0; i < files.length; i++) {
if (files[i].isFile() && !files[i].isHidden()) {
try {
SigningPublicKey pub = new SigningPublicKey();
pub.readBytes(new FileInputStream(files[i]));
BlogInfo info = _archive.getBlogInfo(pub.calculateHash());
if (info != null)
rv.add(info);
} catch (IOException ioe) {
ioe.printStackTrace();
} catch (DataFormatException dfe) {
dfe.printStackTrace();
}
}
}
return rv;
}
public SigningPrivateKey getMyPrivateKey(BlogInfo blog) {
if (blog == null) return null;
File keyFile = new File(_privKeyDir, Base64.encode(blog.getKey().calculateHash().getData()) + ".priv");
try {
FileInputStream in = new FileInputStream(keyFile);
try {
SigningPublicKey pub = new SigningPublicKey();
pub.readBytes(in);
SigningPrivateKey priv = new SigningPrivateKey();
priv.readBytes(in);
return priv;
} finally {
in.close();
}
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return null;
}
}
public String login(User user, String login, String pass) {
File userFile = new File(_userDir, Base64.encode(_context.sha().calculateHash(login.getBytes()).getData()));
// log only the password hash, never the plaintext password
System.out.println("Attempting to login to " + login
+ ": file = " + userFile.getAbsolutePath() + " passHash = "
+ Base64.encode(_context.sha().calculateHash(pass.getBytes()).getData()));
if (userFile.exists()) {
try {
Properties props = new Properties();
BufferedReader in = new BufferedReader(new FileReader(userFile));
try {
String line = null;
while ( (line = in.readLine()) != null) {
int split = line.indexOf('=');
if (split <= 0) continue;
String key = line.substring(0, split);
String val = line.substring(split+1);
props.setProperty(key.trim(), val.trim());
}
} finally {
in.close();
}
return user.login(login, pass, props);
} catch (IOException ioe) {
ioe.printStackTrace();
return "Error logging in - corrupt userfile";
}
} else {
return "User does not exist";
}
}
/** hash of the password required to register and create a new blog (null means no password required) */
public String getRegistrationPassword() {
String pass = _context.getProperty("syndie.registrationPassword");
if ( (pass == null) || (pass.trim().length() <= 0) ) return null;
return pass;
}
public void saveUser(User user) {
if (!user.getAuthenticated()) return;
String userHash = Base64.encode(_context.sha().calculateHash(user.getUsername().getBytes()).getData());
File userFile = new File(_userDir, userHash);
FileWriter out = null;
try {
out = new FileWriter(userFile);
out.write("password=" + user.getHashedPassword() + "\n");
out.write("blog=" + user.getBlog().toBase64() + "\n");
out.write("lastid=" + user.getMostRecentEntry() + "\n");
out.write("lastmetaedition=" + user.getLastMetaEntry() + "\n");
out.write("lastlogin=" + user.getLastLogin() + "\n");
out.write("addressbook=" + user.getAddressbookLocation() + "\n");
out.write("showimages=" + user.getShowImages() + "\n");
out.write("showexpanded=" + user.getShowExpanded() + "\n");
StringBuffer buf = new StringBuffer();
buf.append("groups=");
Map groups = user.getBlogGroups();
for (Iterator iter = groups.keySet().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
List selectors = (List)groups.get(name);
buf.append(name).append(':');
for (int i = 0; i < selectors.size(); i++) {
buf.append(selectors.get(i));
if (i + 1 < selectors.size())
buf.append(",");
}
if (iter.hasNext())
buf.append(' ');
}
buf.append('\n');
out.write(buf.toString());
// shitlist=hash,hash,hash
List shitlistedBlogs = user.getShitlistedBlogs();
if (shitlistedBlogs.size() > 0) {
buf.setLength(0);
buf.append("shitlistedblogs=");
for (int i = 0; i < shitlistedBlogs.size(); i++) {
Hash blog = (Hash)shitlistedBlogs.get(i);
buf.append(blog.toBase64());
if (i + 1 < shitlistedBlogs.size())
buf.append(',');
}
buf.append('\n');
out.write(buf.toString());
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe){}
}
}
public String register(User user, String login, String password, String registrationPassword, String blogName, String blogDescription, String contactURL) {
String hashedRegistrationPassword = getRegistrationPassword();
if (hashedRegistrationPassword != null) {
if (!hashedRegistrationPassword.equals(Base64.encode(_context.sha().calculateHash(registrationPassword.getBytes()).getData())))
return "Invalid registration password";
}
String userHash = Base64.encode(_context.sha().calculateHash(login.getBytes()).getData());
File userFile = new File(_userDir, userHash);
if (userFile.exists()) {
return "Cannot register the login " + login + ": it already exists";
} else {
BlogInfo info = createBlog(blogName, blogDescription, contactURL, null);
String hashedPassword = Base64.encode(_context.sha().calculateHash(password.getBytes()).getData());
FileWriter out = null;
try {
out = new FileWriter(userFile);
out.write("password=" + hashedPassword + "\n");
out.write("blog=" + Base64.encode(info.getKey().calculateHash().getData()) + "\n");
out.write("lastid=-1\n");
out.write("lastmetaedition=0\n");
out.write("addressbook=userhosts-"+userHash + ".txt\n");
out.write("showimages=false\n");
out.write("showexpanded=false\n");
} catch (IOException ioe) {
ioe.printStackTrace();
return "Internal error registering - " + ioe.getMessage();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
String loginResult = login(user, login, password);
_archive.regenerateIndex();
return loginResult;
}
}
public BlogURI createBlogEntry(User user, String subject, String tags, String entryHeaders, String sml) {
return createBlogEntry(user, subject, tags, entryHeaders, sml, null, null, null);
}
public BlogURI createBlogEntry(User user, String subject, String tags, String entryHeaders, String sml, List fileNames, List fileStreams, List fileTypes) {
if (!user.getAuthenticated()) return null;
BlogInfo info = getArchive().getBlogInfo(user.getBlog());
if (info == null) return null;
SigningPrivateKey privkey = getMyPrivateKey(info);
if (privkey == null) return null;
long entryId = -1;
long now = _context.clock().now();
long dayBegin = getDayBegin(now);
if (user.getMostRecentEntry() >= dayBegin)
entryId = user.getMostRecentEntry() + 1;
else
entryId = dayBegin;
StringTokenizer tok = new StringTokenizer(tags, " ,\n\t");
String tagList[] = new String[tok.countTokens()];
for (int i = 0; i < tagList.length; i++)
tagList[i] = tok.nextToken().trim();
BlogURI uri = new BlogURI(user.getBlog(), entryId);
try {
StringBuffer raw = new StringBuffer(sml.length() + 128);
raw.append("Subject: ").append(subject).append('\n');
raw.append("Tags: ");
for (int i = 0; i < tagList.length; i++)
raw.append(tagList[i]).append('\t');
raw.append('\n');
if ( (entryHeaders != null) && (entryHeaders.trim().length() > 0) ) {
System.out.println("Entry headers: " + entryHeaders);
BufferedReader userHeaders = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(entryHeaders.getBytes())));
String line = null;
while ( (line = userHeaders.readLine()) != null) {
line = line.trim();
System.out.println("Line: " + line);
if (line.length() <= 0) continue;
int split = line.indexOf('=');
int split2 = line.indexOf(':');
if ( (split < 0) || ( (split2 > 0) && (split2 < split) ) ) split = split2;
if (split < 0) continue; // neither '=' nor ':' present - skip the malformed header line
String key = line.substring(0,split).trim();
String val = line.substring(split+1).trim();
raw.append(key).append(": ").append(val).append('\n');
}
}
raw.append('\n');
raw.append(sml);
EntryContainer c = new EntryContainer(uri, tagList, raw.toString().getBytes());
if ((fileNames != null) && (fileStreams != null) && (fileNames.size() == fileStreams.size()) ) {
for (int i = 0; i < fileNames.size(); i++) {
String name = (String)fileNames.get(i);
InputStream in = (InputStream)fileStreams.get(i);
String fileType = (fileTypes != null ? (String)fileTypes.get(i) : "application/octet-stream");
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
while (true) {
int read = in.read(buf);
if (read == -1) break;
baos.write(buf, 0, read);
}
byte att[] = baos.toByteArray();
if ( (att != null) && (att.length > 0) )
c.addAttachment(att, new File(name).getName(), null, fileType);
}
}
//for (int i = 7; i < args.length; i++) {
// c.addAttachment(read(args[i]), new File(args[i]).getName(),
// "Attached file", "application/octet-stream");
//}
SessionKey entryKey = null;
//if (!"NONE".equals(args[5]))
// entryKey = new SessionKey(Base64.decode(args[5]));
c.seal(_context, privkey, null);
boolean ok = getArchive().storeEntry(c);
if (ok) {
getArchive().regenerateIndex();
user.setMostRecentEntry(entryId);
saveUser(user);
return uri;
} else {
return null;
}
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
}
}
public String addAddress(User user, String name, String location, String schema) {
if (!user.getAuthenticated()) return "Not logged in";
boolean ok = validateAddressName(name);
if (!ok) return "Invalid name: " + HTMLRenderer.sanitizeString(name);
ok = validateAddressLocation(location);
if (!ok) return "Invalid location: " + HTMLRenderer.sanitizeString(location);
if (!validateAddressSchema(schema)) return "Unsupported schema: " + HTMLRenderer.sanitizeString(schema);
// no need to quote user/location further, as they've been sanitized
FileWriter out = null;
try {
File userHostsFile = new File(user.getAddressbookLocation());
Properties knownHosts = getKnownHosts(user, true);
if (knownHosts.containsKey(name)) return "Name is already in use";
out = new FileWriter(userHostsFile, true);
out.write(name + "=" + location + '\n');
return "Address " + name + " written to your hosts file (" + userHostsFile.getName() + ")";
} catch (IOException ioe) {
return "Error writing out host entry: " + ioe.getMessage();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public Properties getKnownHosts(User user, boolean includePublic) throws IOException {
Properties rv = new Properties();
if ( (user != null) && (user.getAuthenticated()) ) {
File userHostsFile = new File(user.getAddressbookLocation());
rv.putAll(getKnownHosts(userHostsFile));
}
if (includePublic) {
rv.putAll(getKnownHosts(new File("hosts.txt")));
}
return rv;
}
private Properties getKnownHosts(File filename) throws IOException {
Properties rv = new Properties();
if (filename.exists()) {
rv.load(new FileInputStream(filename));
}
return rv;
}
private boolean validateAddressName(String name) {
if ( (name == null) || (name.trim().length() <= 0) || (!name.endsWith(".i2p")) ) return false;
for (int i = 0; i < name.length(); i++) {
char c = name.charAt(i);
if (!Character.isLetterOrDigit(c) && ('.' != c) && ('-' != c) && ('_' != c) )
return false;
}
return true;
}
private boolean validateAddressLocation(String location) {
if ( (location == null) || (location.trim().length() <= 0) ) return false;
try {
Destination d = new Destination(location);
return (d.getPublicKey() != null);
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return false;
}
}
private boolean validateAddressSchema(String schema) {
if ( (schema == null) || (schema.trim().length() <= 0) ) return false;
return "eep".equals(schema) || "i2p".equals(schema);
}
private final GregorianCalendar _cal = new GregorianCalendar();
private long getDayBegin(long now) {
synchronized (_cal) {
_cal.setTimeInMillis(now);
_cal.set(Calendar.MILLISECOND, 0);
_cal.set(Calendar.SECOND, 0);
_cal.set(Calendar.MINUTE, 0);
_cal.set(Calendar.HOUR, 0);
_cal.set(Calendar.HOUR_OF_DAY, 0);
return _cal.getTimeInMillis();
}
}
}
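createBlogEntry derives entry ids from the clock: an entry's id is the millisecond timestamp of the start of the current day, incremented by one for each additional post made that day, keeping ids unique and roughly chronological. A hedged sketch of that scheme; it pins the calendar to UTC for determinism where BlogManager uses the JVM's default timezone, and the class/method names are illustrative:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

// Sketch of Syndie's entry-id scheme: ids are millisecond timestamps
// floored to the start of the day, bumped by one per extra same-day post.
public class EntryIdDemo {
    private static final TimeZone UTC = TimeZone.getTimeZone("UTC");

    /** Millisecond timestamp of 00:00:00.000 (UTC) on the day containing 'now'. */
    public static long dayBegin(long now) {
        GregorianCalendar cal = new GregorianCalendar(UTC);
        cal.setTimeInMillis(now);
        cal.set(Calendar.MILLISECOND, 0);
        cal.set(Calendar.SECOND, 0);
        cal.set(Calendar.MINUTE, 0);
        cal.set(Calendar.HOUR_OF_DAY, 0);
        return cal.getTimeInMillis();
    }

    /** Pick the next entry id given the user's most recent entry id. */
    public static long nextEntryId(long mostRecentEntry, long now) {
        long dayBegin = dayBegin(now);
        return (mostRecentEntry >= dayBegin) ? mostRecentEntry + 1 : dayBegin;
    }
}
```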
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
 * Command line interface for managing a local Syndie archive.
 */
public class CLI {
public static final String USAGE = "Usage: \n" +
"rootDir regenerateIndex\n" +
"rootDir createBlog name description contactURL[ archiveURL]*\n" +
"rootDir createEntry blogPublicKeyHash tag[,tag]* (NOW|entryId) (NONE|entryKeyBase64) smlFile[ attachmentFile]*\n" +
"rootDir listMyBlogs\n" +
"rootDir listTags blogPublicKeyHash\n" +
"rootDir listEntries blogPublicKeyHash blogTag\n" +
"rootDir renderEntry blogPublicKeyHash entryId (NONE|entryKeyBase64) summaryOnly includeImages\n";
public static void main(String args[]) {
//args = new String[] { "~/.syndie/", "listEntries", "9qXCJUyUBCCaiIShURo02ckxjrMvrtiDYENv2ATL3-Y=", "/" };
//args = new String[] { "~/.syndie/", "renderEntry", "Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=", "/", "20050811001", "NONE", "true", "false" };
if (args.length < 2) {
System.err.print(USAGE);
return;
}
String command = args[1];
if ("createBlog".equals(command))
createBlog(args);
else if ("listMyBlogs".equals(command))
listMyBlogs(args);
else if ("createEntry".equals(command))
createEntry(args);
else if ("listTags".equals(command))
listPaths(args);
else if ("listEntries".equals(command))
listEntries(args);
else if ("regenerateIndex".equals(command))
regenerateIndex(args);
else if ("renderEntry".equals(command))
renderEntry(args);
else
System.out.print(USAGE);
}
private static void createBlog(String args[]) {
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
String archives[] = new String[args.length - 5];
System.arraycopy(args, 5, archives, 0, archives.length);
BlogInfo info = mgr.createBlog(args[2], args[3], args[4], archives);
System.out.println("Blog created: " + info);
mgr.getArchive().regenerateIndex();
}
private static void listMyBlogs(String args[]) {
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List info = mgr.listMyBlogs();
for (int i = 0; i < info.size(); i++)
System.out.println(info.get(i).toString());
}
private static void listPaths(String args[]) {
// "rootDir listTags blogPublicKeyHash\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List tags = mgr.getArchive().listTags(new Hash(Base64.decode(args[2])));
System.out.println("tag count: " + tags.size());
for (int i = 0; i < tags.size(); i++)
System.out.println("Tag " + i + ": " + tags.get(i).toString());
}
private static void regenerateIndex(String args[]) {
// "rootDir regenerateIndex\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
mgr.getArchive().regenerateIndex();
System.out.println("Index regenerated");
}
private static void listEntries(String args[]) {
// "rootDir listEntries blogPublicKeyHash tag\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List entries = mgr.getArchive().listEntries(new Hash(Base64.decode(args[2])), -1, args[3], null);
System.out.println("Entry count: " + entries.size());
for (int i = 0; i < entries.size(); i++) {
EntryContainer entry = (EntryContainer)entries.get(i);
System.out.println("***************************************************");
System.out.println("Entry " + i + ": " + entry.getURI().toString());
System.out.println("===================================================");
System.out.println(entry.getEntry().getText());
System.out.println("===================================================");
Attachment attachments[] = entry.getAttachments();
for (int j = 0; j < attachments.length; j++) {
System.out.println("Attachment " + j + ": " + attachments[j]);
}
System.out.println("===================================================");
}
}
private static void renderEntry(String args[]) {
//"rootDir renderEntry blogPublicKeyHash entryId (NONE|entryKeyBase64) summaryOnly includeImages\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
long id = -1;
try {
id = Long.parseLong(args[3]);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return;
}
SessionKey entryKey = null;
if (!("NONE".equals(args[4])))
entryKey = new SessionKey(Base64.decode(args[4])); // args[4] holds the entry key; args[5] is the summaryOnly flag
EntryContainer entry = mgr.getArchive().getEntry(new BlogURI(new Hash(Base64.decode(args[2])), id), entryKey);
if (entry != null) {
HTMLRenderer renderer = new HTMLRenderer();
boolean summaryOnly = "true".equalsIgnoreCase(args[5]);
boolean showImages = "true".equalsIgnoreCase(args[6]);
try {
File f = File.createTempFile("syndie", ".html");
Writer out = new FileWriter(f);
renderer.render(null, mgr.getArchive(), entry, out, summaryOnly, showImages);
out.flush();
out.close();
System.out.println("Rendered to " + f.getAbsolutePath() + ": " + f.length());
} catch (IOException ioe) {
ioe.printStackTrace();
}
} else {
System.err.println("Entry does not exist");
}
}
private static void createEntry(String args[]) {
// "rootDir createEntry blogPublicKey tag[,tag]* (NOW|entryId) (NONE|entryKeyBase64) smlFile[ attachmentFile]*\n" +
I2PAppContext ctx = I2PAppContext.getGlobalContext();
BlogManager mgr = new BlogManager(ctx, args[0]);
long entryId = -1;
if ("NOW".equals(args[4])) {
entryId = ctx.clock().now();
} else {
try {
entryId = Long.parseLong(args[4]);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return;
}
}
StringTokenizer tok = new StringTokenizer(args[3], ",");
String tags[] = new String[tok.countTokens()];
for (int i = 0; i < tags.length; i++)
tags[i] = tok.nextToken();
BlogURI uri = new BlogURI(new Hash(Base64.decode(args[2])), entryId);
BlogInfo blog = mgr.getArchive().getBlogInfo(uri);
if (blog == null) {
System.err.println("Blog does not exist: " + uri);
return;
}
SigningPrivateKey key = mgr.getMyPrivateKey(blog);
try {
byte smlData[] = read(args[6]);
EntryContainer c = new EntryContainer(uri, tags, smlData);
for (int i = 7; i < args.length; i++) {
c.addAttachment(read(args[i]), new File(args[i]).getName(),
"Attached file", "application/octet-stream");
}
SessionKey entryKey = null;
if (!"NONE".equals(args[5]))
entryKey = new SessionKey(Base64.decode(args[5]));
c.seal(ctx, key, entryKey);
boolean ok = mgr.getArchive().storeEntry(c);
System.out.println("Blog entry created: " + c + "? " + ok);
if (ok)
mgr.getArchive().regenerateIndex();
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
private static final byte[] read(String file) throws IOException {
File f = new File(file);
FileInputStream in = new FileInputStream(f);
byte rv[] = new byte[(int)f.length()];
if (rv.length != DataHelper.read(in, rv))
throw new IOException("File not read completely");
return rv;
}
}
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Lazy loading wrapper for an entry, pulling data out of a cached & extracted dir,
* rather than dealing with the crypto, zip, etc.
*
*/
class CachedEntry extends EntryContainer {
private File _entryDir;
private int _format;
private int _size;
private BlogURI _blog;
private Properties _headers;
private Entry _entry;
private Attachment _attachments[];
public CachedEntry(File entryDir) {
_entryDir = entryDir;
importMeta();
_entry = new CachedEntryDetails();
_attachments = null;
}
// always available, loaded from meta
public int getFormat() { return _format; }
public BlogURI getURI() { return _blog; }
public int getCompleteSize() { return _size; }
// don't need to override it, as it works off getHeader
//public String[] getTags() { return super.getTags(); }
public Entry getEntry() { return _entry; }
public Attachment[] getAttachments() {
importAttachments();
return _attachments;
}
public String getHeader(String key) {
importHeaders();
return _headers.getProperty(key);
}
public String toString() { return getURI().toString(); }
public boolean verifySignature(I2PAppContext ctx, BlogInfo info) { return true; }
// not supported...
public void parseRawData(I2PAppContext ctx) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public void parseRawData(I2PAppContext ctx, SessionKey zipKey) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public void setHeader(String name, String val) {
throw new IllegalStateException("Not supported on cached entries");
}
public void addAttachment(byte data[], String name, String description, String mimeType) {
throw new IllegalStateException("Not supported on cached entries");
}
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public Signature getSignature() {
throw new IllegalStateException("Not supported on cached entries");
}
// now the actual lazy loading code
private void importMeta() {
Properties meta = readProps(new File(_entryDir, EntryExtractor.META));
_format = getInt(meta, "format");
_size = getInt(meta, "size");
_blog = new BlogURI(new Hash(Base64.decode(meta.getProperty("blog"))), getLong(meta, "entry"));
}
private Properties importHeaders() {
if (_headers == null)
_headers = readProps(new File(_entryDir, EntryExtractor.HEADERS));
return _headers;
}
private void importAttachments() {
if (_attachments == null) {
List attachments = new ArrayList();
int i = 0;
while (true) {
File meta = new File(_entryDir, EntryExtractor.ATTACHMENT_PREFIX + i + EntryExtractor.ATTACHMENT_META_SUFFIX);
if (meta.exists())
attachments.add(new CachedAttachment(i, meta));
else
break;
i++;
}
Attachment a[] = new Attachment[attachments.size()];
for (i = 0; i < a.length; i++)
a[i] = (Attachment)attachments.get(i);
_attachments = a;
}
return;
}
private static Properties readProps(File propsFile) {
Properties rv = new Properties();
BufferedReader in = null;
try {
in = new BufferedReader(new FileReader(propsFile));
String line = null;
while ( (line = in.readLine()) != null) {
int split = line.indexOf('=');
if ( (split <= 0) || (split >= line.length()) ) continue;
rv.setProperty(line.substring(0, split).trim(), line.substring(split+1).trim());
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
}
return rv;
}
private static final int getInt(Properties props, String key) {
String val = props.getProperty(key);
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) {}
return -1;
}
private static final long getLong(Properties props, String key) {
String val = props.getProperty(key);
try { return Long.parseLong(val); } catch (NumberFormatException nfe) {}
return -1L;
}
private class CachedEntryDetails extends Entry {
private String _text;
public CachedEntryDetails() {
super(null);
}
public String getText() {
importText();
return _text;
}
private void importText() {
if (_text == null) {
InputStream in = null;
try {
File f = new File(_entryDir, EntryExtractor.ENTRY);
byte buf[] = new byte[(int)f.length()]; // hmm
in = new FileInputStream(f);
int read = DataHelper.read(in, buf);
if (read != buf.length) throw new IOException("read: " + read + " file size: " + buf.length + " for " + f.getPath());
_text = new String(buf);
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
}
}
}
}
private class CachedAttachment extends Attachment {
private int _attachmentId;
private File _metaFile;
private Properties _attachmentHeaders;
private int _dataSize;
public CachedAttachment(int id, File meta) {
super(null, null);
_attachmentId = id;
_metaFile = meta;
_attachmentHeaders = null;
}
public int getDataLength() {
importAttachmentHeaders();
return _dataSize;
}
public byte[] getData() {
throw new IllegalStateException("Not supported on cached entries");
}
public InputStream getDataStream() throws IOException {
String name = EntryExtractor.ATTACHMENT_PREFIX + _attachmentId + EntryExtractor.ATTACHMENT_DATA_SUFFIX;
File f = new File(_entryDir, name);
return new FileInputStream(f);
}
public byte[] getRawMetadata() {
throw new IllegalStateException("Not supported on cached entries");
}
public String getMeta(String key) {
importAttachmentHeaders();
return _attachmentHeaders.getProperty(key);
}
//public String getName() { return getMeta(NAME); }
//public String getDescription() { return getMeta(DESCRIPTION); }
//public String getMimeType() { return getMeta(MIMETYPE); }
public void setMeta(String key, String val) {
throw new IllegalStateException("Not supported on cached entries");
}
public Map getMeta() {
importAttachmentHeaders();
return _attachmentHeaders;
}
public String toString() {
importAttachmentHeaders();
int len = _dataSize;
return getName()
+ (getDescription() != null ? ": " + getDescription() : "")
+ (getMimeType() != null ? ", type: " + getMimeType() : "")
+ ", size: " + len;
}
private void importAttachmentHeaders() {
if (_attachmentHeaders == null) {
Properties props = readProps(_metaFile);
String sz = (String)props.remove(EntryExtractor.ATTACHMENT_DATA_SIZE);
if (sz != null) {
try {
_dataSize = Integer.parseInt(sz);
} catch (NumberFormatException nfe) {}
}
_attachmentHeaders = props;
}
}
}
}
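CachedEntry.readProps above and BlogManager's login loop both parse the same ad-hoc line format: one name=value pair per line, split on the first '=', both sides trimmed, malformed lines skipped. A minimal sketch of that parser over an in-memory string; the class and method names are illustrative, not part of Syndie:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Sketch of the name=value line format used for headers.txt, meta.txt,
// and the user files: split each line on the first '=', trim both sides.
public class NameValueDemo {
    public static Properties parse(String text) {
        Properties rv = new Properties();
        try {
            BufferedReader in = new BufferedReader(new StringReader(text));
            String line;
            while ((line = in.readLine()) != null) {
                int split = line.indexOf('=');
                if (split <= 0) continue; // no '=' or empty name: skip malformed line
                rv.setProperty(line.substring(0, split).trim(), line.substring(split + 1).trim());
            }
        } catch (IOException ioe) {
            ioe.printStackTrace(); // cannot happen for an in-memory reader
        }
        return rv;
    }
}
```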
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.I2PAppContext;
/**
* To cut down on unnecessary IO/cpu load, extract entries onto the disk for
* faster access later. Individual entries are stored in subdirectories based on
* their name - $archiveDir/$blogDir/$entryId.snd extracts its files into various
* files under $cacheDir/$blogDir/$entryId/:
* headers.txt: name=value pairs for attributes of the entry container itself
* info.txt: name=value pairs for implicit attributes of the container (blog, id, format, size)
* entry.sml: raw sml file
* attachmentN_data.dat: raw binary data for attachment N
* attachmentN_meta.dat: name=value pairs for attributes of attachment N
*
*/
public class EntryExtractor {
private I2PAppContext _context;
static final String HEADERS = "headers.txt";
static final String META = "meta.txt";
static final String ENTRY = "entry.sml";
static final String ATTACHMENT_PREFIX = "attachment";
static final String ATTACHMENT_DATA_SUFFIX = "_data.dat";
static final String ATTACHMENT_META_SUFFIX = "_meta.txt";
static final String ATTACHMENT_DATA_SIZE = "EntryExtractor__dataSize";
public EntryExtractor(I2PAppContext context) {
_context = context;
}
public boolean extract(File entryFile, File entryDir, SessionKey entryKey, BlogInfo info) throws IOException {
EntryContainer entry = new EntryContainer();
FileInputStream in = new FileInputStream(entryFile);
try {
entry.load(in);
} finally {
in.close();
}
boolean ok = entry.verifySignature(_context, info);
if (!ok) {
return false;
} else {
entry.setCompleteSize((int)entryFile.length());
if (entryKey != null)
entry.parseRawData(_context, entryKey);
else
entry.parseRawData(_context);
extract(entry, entryDir);
return true;
}
}
public void extract(EntryContainer entry, File entryDir) throws IOException {
extractHeaders(entry, entryDir);
extractMeta(entry, entryDir);
extractEntry(entry, entryDir);
Attachment attachments[] = entry.getAttachments();
if (attachments != null) {
for (int i = 0; i < attachments.length; i++) {
extractAttachmentData(i, attachments[i], entryDir);
extractAttachmentMetadata(i, attachments[i], entryDir);
}
}
}
private void extractHeaders(EntryContainer entry, File entryDir) throws IOException {
FileWriter out = null;
try {
out = new FileWriter(new File(entryDir, HEADERS));
Map headers = entry.getHeaders();
for (Iterator iter = headers.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = (String)headers.get(k);
out.write(k.trim() + '=' + v.trim() + '\n');
}
} finally {
if (out != null) out.close();
}
}
private void extractMeta(EntryContainer entry, File entryDir) throws IOException {
FileWriter out = null;
try {
out = new FileWriter(new File(entryDir, META));
out.write("format=" + entry.getFormat() + '\n');
out.write("size=" + entry.getCompleteSize() + '\n');
out.write("blog=" + entry.getURI().getKeyHash().toBase64() + '\n');
out.write("entry=" + entry.getURI().getEntryId() + '\n');
} finally {
if (out != null) out.close();
}
}
private void extractEntry(EntryContainer entry, File entryDir) throws IOException {
FileWriter out = null;
try {
out = new FileWriter(new File(entryDir, ENTRY));
out.write(entry.getEntry().getText());
} finally {
if (out != null) out.close();
}
}
private void extractAttachmentData(int num, Attachment attachment, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ATTACHMENT_PREFIX + num + ATTACHMENT_DATA_SUFFIX));
//out.write(attachment.getData());
InputStream data = attachment.getDataStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = data.read(buf)) != -1)
out.write(buf, 0, read);
data.close();
} finally {
if (out != null) out.close();
}
}
private void extractAttachmentMetadata(int num, Attachment attachment, File entryDir) throws IOException {
FileWriter out = null;
try {
out = new FileWriter(new File(entryDir, ATTACHMENT_PREFIX + num + ATTACHMENT_META_SUFFIX));
Map meta = attachment.getMeta();
for (Iterator iter = meta.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = (String)meta.get(k);
out.write(k + '=' + v + '\n');
}
out.write(ATTACHMENT_DATA_SIZE + '=' + attachment.getDataLength());
} finally {
if (out != null) out.close();
}
}
}
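EntryExtractor smuggles each attachment's byte count into its meta file under the reserved ATTACHMENT_DATA_SIZE key, and CachedAttachment later strips that key back out of the user-visible headers. A small sketch of the reader side of that round-trip; the class and method names are illustrative, not part of Syndie:

```java
import java.util.Properties;

// Sketch of the attachment-size round-trip: the extractor writes the size
// into the meta file under a reserved key; the cached reader removes that
// key before exposing the remaining headers to callers.
public class AttachmentMetaDemo {
    static final String SIZE_KEY = "EntryExtractor__dataSize";

    /** Remove the reserved size entry from props and return it, or -1 if absent/bad. */
    public static int takeSize(Properties props) {
        String sz = (String) props.remove(SIZE_KEY);
        if (sz == null) return -1;
        try {
            return Integer.parseInt(sz);
        } catch (NumberFormatException nfe) {
            return -1;
        }
    }
}
```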
package net.i2p.syndie;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
/**
* User session state and preferences.
*
*/
public class User {
private I2PAppContext _context;
private String _username;
private String _hashedPassword;
private Hash _blog;
private long _mostRecentEntry;
/** Group name to List of blog selectors, where the selectors are of the form
* blog://$key, entry://$key/$entryId, blogtag://$key/$tag, tag://$tag
*/
private Map _blogGroups;
/** list of blogs (Hash) we never want to see entries from */
private List _shitlistedBlogs;
/** where our userhosts.txt is */
private String _addressbookLocation;
private boolean _showImagesByDefault;
private boolean _showExpandedByDefault;
private long _lastLogin;
private long _lastMetaEntry;
private boolean _authenticated;
public User() {
_context = I2PAppContext.getGlobalContext();
init();
}
private void init() {
_authenticated = false;
_username = null;
_hashedPassword = null;
_blog = null;
_mostRecentEntry = -1;
_blogGroups = new HashMap();
_shitlistedBlogs = new ArrayList();
_addressbookLocation = "userhosts.txt";
_showImagesByDefault = false;
_showExpandedByDefault = false;
_lastLogin = -1;
_lastMetaEntry = 0;
}
public boolean getAuthenticated() { return _authenticated; }
public String getUsername() { return _username; }
public Hash getBlog() { return _blog; }
public String getBlogStr() { return (_blog != null ? Base64.encode(_blog.getData()) : null); }
public long getMostRecentEntry() { return _mostRecentEntry; }
public Map getBlogGroups() { return _blogGroups; }
public List getShitlistedBlogs() { return _shitlistedBlogs; }
public String getAddressbookLocation() { return _addressbookLocation; }
public boolean getShowImages() { return _showImagesByDefault; }
public boolean getShowExpanded() { return _showExpandedByDefault; }
public long getLastLogin() { return _lastLogin; }
public String getHashedPassword() { return _hashedPassword; }
public long getLastMetaEntry() { return _lastMetaEntry; }
public void setMostRecentEntry(long id) { _mostRecentEntry = id; }
public void setLastMetaEntry(long id) { _lastMetaEntry = id; }
public void invalidate() {
BlogManager.instance().saveUser(this);
init();
}
public String login(String login, String pass, Properties props) {
String expectedPass = props.getProperty("password");
String hpass = Base64.encode(_context.sha().calculateHash(pass.getBytes()).getData());
if (!hpass.equals(expectedPass)) {
_authenticated = false;
return "Incorrect password";
}
_username = login;
_hashedPassword = expectedPass;
// blog=luS9d3uaf....HwAE=
String b = props.getProperty("blog");
if (b != null) _blog = new Hash(Base64.decode(b));
// lastid=12345
String id = props.getProperty("lastid");
if (id != null) try { _mostRecentEntry = Long.parseLong(id); } catch (NumberFormatException nfe) {}
// lastmetaedition=12345
id = props.getProperty("lastmetaedition");
if (id != null) try { _lastMetaEntry = Long.parseLong(id); } catch (NumberFormatException nfe) {}
// groups=abc:selector,selector,selector,selector def:selector,selector,selector
StringTokenizer tok = new StringTokenizer(props.getProperty("groups", ""), " ");
while (tok.hasMoreTokens()) {
String group = tok.nextToken();
int endName = group.indexOf(':');
if (endName <= 0)
continue;
String groupName = group.substring(0, endName);
String sel = group.substring(endName+1);
List selectors = new ArrayList();
while ( (sel != null) && (sel.length() > 0) ) {
int end = sel.indexOf(',');
if (end < 0) {
selectors.add(sel);
sel = null;
} else {
if (end + 1 >= sel.length()) {
selectors.add(sel.substring(0,end));
sel = null;
} else if (end == 0) {
sel = sel.substring(1);
} else {
selectors.add(sel.substring(0, end));
sel = sel.substring(end+1);
}
}
}
_blogGroups.put(groupName.trim(), selectors);
}
// shitlist=hash,hash,hash
tok = new StringTokenizer(props.getProperty("shitlistedblogs", ""), ",");
while (tok.hasMoreTokens()) {
String blog = tok.nextToken();
byte bl[] = Base64.decode(blog);
if ( (bl != null) && (bl.length == Hash.HASH_LENGTH) )
_shitlistedBlogs.add(new Hash(bl));
}
String addr = props.getProperty("addressbook", "userhosts.txt");
if (addr != null)
_addressbookLocation = addr;
String show = props.getProperty("showimages", "false");
_showImagesByDefault = (show != null) && (show.equals("true"));
show = props.getProperty("showexpanded", "false");
_showExpandedByDefault = (show != null) && (show.equals("true"));
_lastLogin = _context.clock().now();
_authenticated = true;
return LOGIN_OK;
}
public static final String LOGIN_OK = "Logged in";
}
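The "groups" property parsed in User.login() above packs named selector lists into one string ("name:sel1,sel2 name2:sel3"), with the hand-rolled loop skipping empty selectors. A minimal standalone sketch of the same parse (hypothetical class name, not part of Syndie, and using split() where the original walks indexes by hand):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Sketch of the "groups" property parse in User.login():
 * "name:sel1,sel2 name2:sel3" becomes a map of group name to selector list.
 * Empty selectors (from ",," or trailing commas) are dropped, matching the
 * original loop's behavior.
 */
public class GroupParseSketch {
    public static Map<String, List<String>> parseGroups(String prop) {
        Map<String, List<String>> groups = new HashMap<>();
        for (String group : prop.split(" ")) {
            int endName = group.indexOf(':');
            if (endName <= 0)
                continue; // malformed token: no group name
            List<String> selectors = new ArrayList<>();
            for (String sel : group.substring(endName + 1).split(","))
                if (sel.length() > 0)
                    selectors.add(sel);
            groups.put(group.substring(0, endName).trim(), selectors);
        }
        return groups;
    }

    public static void main(String[] args) {
        Map<String, List<String>> g = parseGroups("friends:blog://abc,tag://fun tech:tag://i2p,,");
        if (!g.get("friends").equals(Arrays.asList("blog://abc", "tag://fun")))
            throw new AssertionError(g.toString());
        if (!g.get("tech").equals(Arrays.asList("tag://i2p")))
            throw new AssertionError(g.toString());
        System.out.println("ok");
    }
}
```

Note that the first ':' splits name from selectors, so "blog://..." selectors (which themselves contain ':') must come after the group name, as in the original format comment.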
package net.i2p.syndie;
/**
*
*/
public class Version {
public static final String VERSION = "0-alpha";
public static final String BUILD = "0";
public static final String INDEX_VERSION = "1.0";
public static final String ID = "$Id$";
}
package net.i2p.syndie.data;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
import net.i2p.syndie.BlogManager;
/**
* Simple read-only summary of an archive
*/
public class ArchiveIndex {
protected String _version;
protected long _generatedOn;
protected int _allBlogs;
protected int _newBlogs;
protected int _allEntries;
protected int _newEntries;
protected long _totalSize;
protected long _newSize;
/** list of BlogSummary objects */
protected List _blogs;
/** list of Hash objects */
protected List _newestBlogs;
/** list of BlogURI objects */
protected List _newestEntries;
protected Properties _headers;
public ArchiveIndex() {
this(false);
}
public ArchiveIndex(boolean shouldLoad) {
_blogs = new ArrayList();
_newestBlogs = new ArrayList();
_newestEntries = new ArrayList();
_headers = new Properties();
_generatedOn = -1;
if (shouldLoad)
setIsLocal("true");
}
public String getVersion() { return _version; }
public Properties getHeaders() { return _headers; }
public int getAllBlogs() { return _allBlogs; }
public int getNewBlogs() { return _newBlogs; }
public int getAllEntries() { return _allEntries; }
public int getNewEntries() { return _newEntries; }
public long getTotalSize() { return _totalSize; }
public long getNewSize() { return _newSize; }
public long getGeneratedOn() { return _generatedOn; }
public String getNewSizeStr() {
if (_newSize < 1024) return _newSize + "";
if (_newSize < 1024*1024) return _newSize/1024 + "KB";
else return _newSize/(1024*1024) + "MB";
}
public String getTotalSizeStr() {
if (_totalSize < 1024) return _totalSize + "";
if (_totalSize < 1024*1024) return _totalSize/1024 + "KB";
else return _totalSize/(1024*1024) + "MB";
}
/** how many blogs/tags are indexed */
public int getIndexBlogs() { return _blogs.size(); }
/** get the blog used for the given blog/tag pair */
public Hash getBlog(int index) { return ((BlogSummary)_blogs.get(index)).blog; }
/** get the tag used for the given blog/tag pair */
public String getBlogTag(int index) { return ((BlogSummary)_blogs.get(index)).tag; }
/** get the highest entry ID for the given blog/tag pair */
public long getBlogLastUpdated(int index) { return ((BlogSummary)_blogs.get(index)).lastUpdated; }
/** get the entry count for the given blog/tag pair */
public int getBlogEntryCount(int index) { return ((BlogSummary)_blogs.get(index)).entries.size(); }
/** get the entry from the given blog/tag pair */
public BlogURI getBlogEntry(int index, int entryIndex) { return ((EntrySummary)((BlogSummary)_blogs.get(index)).entries.get(entryIndex)).entry; }
/** get the raw entry size (including attachments) from the given blog/tag pair */
public long getBlogEntrySizeKB(int index, int entryIndex) { return ((EntrySummary)((BlogSummary)_blogs.get(index)).entries.get(entryIndex)).size; }
/** how many 'new' blogs are listed */
public int getNewestBlogCount() { return _newestBlogs.size(); }
public Hash getNewestBlog(int index) { return (Hash)_newestBlogs.get(index); }
/** how many 'new' entries are listed */
public int getNewestBlogEntryCount() { return _newestEntries.size(); }
public BlogURI getNewestBlogEntry(int index) { return (BlogURI)_newestEntries.get(index); }
/** list of locally known tags (String) under the given blog */
public List getBlogTags(Hash blog) {
List rv = new ArrayList();
for (int i = 0; i < _blogs.size(); i++) {
if (getBlog(i).equals(blog))
rv.add(getBlogTag(i));
}
return rv;
}
/** list of unique blogs locally known (set of Hash) */
public Set getUniqueBlogs() {
Set rv = new HashSet();
for (int i = 0; i < _blogs.size(); i++)
rv.add(getBlog(i));
return rv;
}
public void setLocation(String location) {
try {
File l = new File(location);
if (l.exists())
load(l);
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public void setIsLocal(String val) {
if ("true".equals(val)) {
try {
File dir = BlogManager.instance().getArchive().getArchiveDir();
load(new File(dir, Archive.INDEX_FILE));
} catch (IOException ioe) {}
}
}
public void load(File location) throws IOException {
FileInputStream in = null;
try {
in = new FileInputStream(location);
load(in);
} finally {
if (in != null)
try { in.close(); } catch (IOException ioe) {}
}
}
/** load up the index from an archive.txt */
public void load(InputStream index) throws IOException {
_allBlogs = 0;
_allEntries = 0;
_newBlogs = 0;
_newEntries = 0;
_newSize = 0;
_totalSize = 0;
_version = null;
_blogs = new ArrayList();
_newestBlogs = new ArrayList();
_newestEntries = new ArrayList();
_headers = new Properties();
BufferedReader in = new BufferedReader(new InputStreamReader(index));
String line = null;
line = in.readLine();
if (line == null)
return;
if (!line.startsWith("SyndieVersion:"))
throw new IOException("Index is invalid - it starts with " + line);
_version = line.substring("SyndieVersion:".length()).trim();
if (!_version.startsWith("1."))
throw new IOException("Index is not supported, we only handle versions 1.*, but it is " + _version);
while ( (line = in.readLine()) != null) {
if (line.length() <= 0)
break;
if (line.startsWith("Blog:")) break;
int split = line.indexOf(':');
if (split <= 0) continue;
if (split >= line.length()-1) continue;
_headers.setProperty(line.substring(0, split), line.substring(split+1));
}
if (line != null) {
do {
if (!line.startsWith("Blog:"))
break;
loadBlog(line);
} while ( (line = in.readLine()) != null);
}
// ignore the first line that doesn't start with "Blog:" - it's blank
while ( (line = in.readLine()) != null) {
int split = line.indexOf(':');
if (split <= 0) continue;
if (split >= line.length()-1) continue;
String key = line.substring(0, split);
String val = line.substring(split+1);
if (key.equals("AllBlogs"))
_allBlogs = getInt(val);
else if (key.equals("NewBlogs"))
_newBlogs = getInt(val);
else if (key.equals("AllEntries"))
_allEntries = getInt(val);
else if (key.equals("NewEntries"))
_newEntries = getInt(val);
else if (key.equals("TotalSize"))
_totalSize = getInt(val);
else if (key.equals("NewSize"))
_newSize = getInt(val);
else if (key.equals("NewestBlogs"))
_newestBlogs = parseNewestBlogs(val);
else if (key.equals("NewestEntries"))
_newestEntries = parseNewestEntries(val);
else
System.err.println("Key: " + key + " val: " + val);
}
}
/**
* Dig through the index for BlogURIs matching the given criteria, ordering the results by
* their own entryIds.
*
* @param out where to store the matches
* @param blog if set, what blog key must the entries be under
* @param tag if set, what tag must the entry be in
*
*/
public void selectMatchesOrderByEntryId(List out, Hash blog, String tag) {
TreeMap ordered = new TreeMap();
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (blog != null) {
if (!blog.equals(summary.blog))
continue;
}
if (tag != null) {
if (!tag.equals(summary.tag)) {
System.out.println("Tag [" + summary.tag + "] does not match the requested [" + tag + "]");
continue;
}
}
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
ordered.put(new Long(0-entry.entry.getEntryId()), entry.entry);
}
}
for (Iterator iter = ordered.values().iterator(); iter.hasNext(); ) {
BlogURI entry = (BlogURI)iter.next();
if (!out.contains(entry))
out.add(entry);
}
}
private static final int getInt(String val) {
try {
return Integer.parseInt(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return 0;
}
}
private List parseNewestBlogs(String vals) {
List rv = new ArrayList();
StringTokenizer tok = new StringTokenizer(vals, " \t\n");
while (tok.hasMoreTokens())
rv.add(new Hash(Base64.decode(tok.nextToken())));
return rv;
}
private List parseNewestEntries(String vals) {
List rv = new ArrayList();
StringTokenizer tok = new StringTokenizer(vals, " \t\n");
while (tok.hasMoreTokens())
rv.add(new BlogURI(tok.nextToken()));
return rv;
}
private void loadBlog(String line) throws IOException {
// Blog: hash YYYYMMDD tag\t[ yyyymmdd_n_sizeKB]*
StringTokenizer tok = new StringTokenizer(line.trim(), " \n\t");
if (tok.countTokens() < 4)
return;
tok.nextToken();
Hash keyHash = new Hash(Base64.decode(tok.nextToken()));
long when = getIndexDate(tok.nextToken());
String tag = tok.nextToken();
BlogSummary summary = new BlogSummary();
summary.blog = keyHash;
summary.tag = tag.trim();
summary.lastUpdated = when;
summary.entries = new ArrayList();
while (tok.hasMoreTokens()) {
String entry = tok.nextToken();
long id = Archive.getEntryIdFromIndexName(entry);
int kb = Archive.getSizeFromIndexName(entry);
summary.entries.add(new EntrySummary(new BlogURI(keyHash, id), kb));
}
_blogs.add(summary);
}
private SimpleDateFormat _dateFmt = new SimpleDateFormat("yyyyMMdd");
private long getIndexDate(String yyyymmdd) {
synchronized (_dateFmt) {
try {
return _dateFmt.parse(yyyymmdd).getTime();
} catch (ParseException pe) {
return -1;
}
}
}
private String getIndexDate(long when) {
synchronized (_dateFmt) {
return _dateFmt.format(new Date(when));
}
}
protected class BlogSummary {
Hash blog;
String tag;
long lastUpdated;
/** list of EntrySummary objects */
List entries;
public BlogSummary() {
entries = new ArrayList();
}
}
protected class EntrySummary {
BlogURI entry;
long size;
public EntrySummary(BlogURI uri, long kb) {
size = kb;
entry = uri;
}
}
/** export the index into an archive.txt */
public String toString() {
StringBuffer rv = new StringBuffer(1024);
rv.append("SyndieVersion: ").append(_version).append('\n');
for (Iterator iter = _headers.keySet().iterator(); iter.hasNext(); ) {
String key = (String)iter.next();
String val = _headers.getProperty(key);
rv.append(key).append(": ").append(val).append('\n');
}
for (int i = 0; i < _blogs.size(); i++) {
rv.append("Blog: ");
Hash blog = getBlog(i);
String tag = getBlogTag(i);
rv.append(Base64.encode(blog.getData())).append(' ');
rv.append(getIndexDate(getBlogLastUpdated(i))).append(' ');
rv.append(tag).append('\t');
int entries = getBlogEntryCount(i);
for (int j = 0; j < entries; j++) {
BlogURI entry = getBlogEntry(i, j);
long kb = getBlogEntrySizeKB(i, j);
rv.append(Archive.getIndexName(entry.getEntryId(), (int)kb*1024)).append(' ');
}
rv.append('\n');
}
rv.append('\n');
rv.append("AllBlogs: ").append(_allBlogs).append('\n');
rv.append("NewBlogs: ").append(_newBlogs).append('\n');
rv.append("AllEntries: ").append(_allEntries).append('\n');
rv.append("NewEntries: ").append(_newEntries).append('\n');
rv.append("TotalSize: ").append(_totalSize).append('\n');
rv.append("NewSize: ").append(_newSize).append('\n');
rv.append("NewestBlogs: ");
for (int i = 0; i < _newestBlogs.size(); i++)
rv.append(((Hash)(_newestBlogs.get(i))).toBase64()).append(' ');
rv.append('\n');
rv.append("NewestEntries: ");
for (int i = 0; i < _newestEntries.size(); i++)
rv.append(((BlogURI)_newestEntries.get(i)).toString()).append(' ');
rv.append('\n');
return rv.toString();
}
/** Usage: ArchiveIndex archive.txt */
public static void main(String args[]) {
try {
ArchiveIndex i = new ArchiveIndex();
i.load(new File(args[0]));
System.out.println(i.toString());
} catch (IOException ioe) { ioe.printStackTrace(); }
}
}
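The header section of archive.txt handled by ArchiveIndex.load() is a "SyndieVersion:" line followed by "Key: value" pairs, terminated by a blank line or the first "Blog:" line. A self-contained sketch of just that header parse (hypothetical class name; it trims the value for convenience, which the real loader does not):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

/**
 * Sketch of the archive.txt header parse in ArchiveIndex.load(): the
 * version line must come first, then "Key: value" lines are read until
 * a blank line or a "Blog:" line.
 */
public class IndexHeaderSketch {
    public static Properties parseHeaders(String index) throws IOException {
        BufferedReader in = new BufferedReader(new StringReader(index));
        String line = in.readLine();
        if (line == null || !line.startsWith("SyndieVersion:"))
            throw new IOException("Index is invalid - it starts with " + line);
        Properties headers = new Properties();
        while ((line = in.readLine()) != null) {
            if (line.length() <= 0 || line.startsWith("Blog:"))
                break; // end of the header section
            int split = line.indexOf(':');
            if (split <= 0 || split >= line.length() - 1)
                continue; // skip malformed header lines
            headers.setProperty(line.substring(0, split), line.substring(split + 1).trim());
        }
        return headers;
    }

    public static void main(String[] args) throws IOException {
        Properties p = parseHeaders("SyndieVersion: 1.0\nAllBlogs: 3\n\nAllEntries: 9\n");
        if (!"3".equals(p.getProperty("AllBlogs")))
            throw new AssertionError(p.toString());
        if (p.getProperty("AllEntries") != null)
            throw new AssertionError("read past the blank line");
        System.out.println("ok");
    }
}
```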
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
/**
*
*/
public class Attachment {
private byte _data[];
private byte _rawMetadata[];
private List _keys;
private List _values;
public Attachment(byte data[], byte metadata[]) {
_data = data;
_rawMetadata = metadata;
_keys = new ArrayList();
_values = new ArrayList();
parseMeta();
}
public static final String NAME = "Name";
public static final String DESCRIPTION = "Description";
public static final String MIMETYPE = "MimeType";
public Attachment(byte data[], String name, String description, String mimeType) {
_data = data;
_keys = new ArrayList();
_values = new ArrayList();
_keys.add(NAME);
_values.add(name);
if ( (description != null) && (description.trim().length() > 0) ) {
_keys.add(DESCRIPTION);
_values.add(description);
}
if ( (mimeType != null) && (mimeType.trim().length() > 0) ) {
_keys.add(MIMETYPE);
_values.add(mimeType);
}
createMeta();
}
public byte[] getData() { return _data; }
public int getDataLength() { return _data.length; }
public byte[] getRawMetadata() { return _rawMetadata; }
public InputStream getDataStream() throws IOException { return new ByteArrayInputStream(_data); }
public String getMeta(String key) {
for (int i = 0; i < _keys.size(); i++) {
if (key.equals(_keys.get(i)))
return (String)_values.get(i);
}
return null;
}
public String getName() { return getMeta(NAME); }
public String getDescription() { return getMeta(DESCRIPTION); }
public String getMimeType() { return getMeta(MIMETYPE); }
public void setMeta(String key, String val) {
for (int i = 0; i < _keys.size(); i++) {
if (key.equals(_keys.get(i))) {
_values.set(i, val);
return;
}
}
_keys.add(key);
_values.add(val);
}
public Map getMeta() {
Map rv = new HashMap(_keys.size());
for (int i = 0; i < _keys.size(); i++) {
String k = (String)_keys.get(i);
String v = (String)_values.get(i);
rv.put(k,v);
}
return rv;
}
private void createMeta() {
StringBuffer meta = new StringBuffer(64);
for (int i = 0; i < _keys.size(); i++) {
meta.append(_keys.get(i)).append(':').append(_values.get(i)).append('\n');
}
_rawMetadata = meta.toString().getBytes();
}
private void parseMeta() {
if (_rawMetadata == null) return;
String key = null;
String val = null;
int keyBegin = 0;
int valBegin = -1;
for (int i = 0; i < _rawMetadata.length; i++) {
if ( (_rawMetadata[i] == ':') && (key == null) ) {
key = new String(_rawMetadata, keyBegin, i - keyBegin);
valBegin = i + 1;
} else if (_rawMetadata[i] == '\n') {
val = new String(_rawMetadata, valBegin, i - valBegin);
_keys.add(key);
_values.add(val);
keyBegin = i + 1;
key = null;
val = null;
}
}
}
public String toString() {
int len = 0;
if (_data != null)
len = _data.length;
return getName()
+ (getDescription() != null ? ": " + getDescription() : "")
+ (getMimeType() != null ? ", type: " + getMimeType() : "")
+ ", size: " + len;
}
}
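The attachment metadata above is a flat "key:val\n" encoding, serialized by createMeta() and read back by parseMeta(). A standalone round-trip sketch of that format (hypothetical class name, not the Syndie code; it assumes keys contain no ':' and values no '\n', and splits on the first ':' only so values may contain colons):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sketch of the Attachment metadata format: ordered "key:val\n" pairs
 * serialized to bytes and parsed back, splitting each line on the
 * first ':' only.
 */
public class AttachmentMetaSketch {
    public static byte[] encode(Map<String, String> meta) {
        StringBuilder buf = new StringBuilder(64);
        for (Map.Entry<String, String> e : meta.entrySet())
            buf.append(e.getKey()).append(':').append(e.getValue()).append('\n');
        return buf.toString().getBytes();
    }

    public static Map<String, String> decode(byte[] raw) {
        Map<String, String> meta = new LinkedHashMap<>();
        for (String line : new String(raw).split("\n")) {
            int split = line.indexOf(':');
            if (split > 0)
                meta.put(line.substring(0, split), line.substring(split + 1));
        }
        return meta;
    }

    public static void main(String[] args) {
        Map<String, String> meta = new LinkedHashMap<>();
        meta.put("Name", "photo.jpg");
        meta.put("MimeType", "image/jpeg");
        meta.put("Description", "taken at 12:30"); // value containing ':'
        if (!decode(encode(meta)).equals(meta))
            throw new AssertionError("round trip failed");
        System.out.println("ok");
    }
}
```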
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Blog metadata. Formatted as: <pre>
* [key:val\n]*
* </pre>
*
* Required keys:
* Owner: base64 of their signing public key
* Signature: base64 of the DSA signature of the rest of the ordered metadata
*
* Optional keys:
* Posters: comma delimited list of base64 signing public keys that
* can post to the blog
* Name: name of the blog
* Description: brief description of the blog
*
*/
public class BlogInfo {
private SigningPublicKey _key;
private SigningPublicKey _posters[];
private String _optionNames[];
private String _optionValues[];
private Signature _signature;
public BlogInfo() {}
public BlogInfo(SigningPublicKey key, SigningPublicKey posters[], Properties opts) {
_optionNames = new String[0];
_optionValues = new String[0];
setKey(key);
setPosters(posters);
for (Iterator iter = opts.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = opts.getProperty(k);
setProperty(k.trim(), v.trim());
}
}
public SigningPublicKey getKey() { return _key; }
public void setKey(SigningPublicKey key) {
_key = key;
setProperty(OWNER_KEY, Base64.encode(key.getData()));
}
public static final String OWNER_KEY = "Owner";
public static final String POSTERS = "Posters";
public static final String SIGNATURE = "Signature";
public static final String NAME = "Name";
public static final String DESCRIPTION = "Description";
public void load(InputStream in) throws IOException {
BufferedReader reader = new BufferedReader(new InputStreamReader(in));
List names = new ArrayList();
List vals = new ArrayList();
String line = null;
while ( (line = reader.readLine()) != null) {
line = line.trim();
int len = line.length();
int split = line.indexOf(':');
if ( (len <= 0) || (split <= 0) || (split >= len - 2) )
continue;
String key = line.substring(0, split).trim();
String val = line.substring(split+1).trim();
names.add(key);
vals.add(val);
}
_optionNames = new String[names.size()];
_optionValues = new String[names.size()];
for (int i = 0; i < _optionNames.length; i++) {
_optionNames[i] = (String)names.get(i);
_optionValues[i] = (String)vals.get(i);
}
String keyStr = getProperty(OWNER_KEY);
if (keyStr == null) throw new IOException("Owner not found");
_key = new SigningPublicKey(Base64.decode(keyStr));
String postersStr = getProperty(POSTERS);
if (postersStr != null) {
StringTokenizer tok = new StringTokenizer(postersStr, ", \t");
_posters = new SigningPublicKey[tok.countTokens()];
for (int i = 0; tok.hasMoreTokens(); i++)
_posters[i] = new SigningPublicKey(Base64.decode(tok.nextToken()));
}
String sigStr = getProperty(SIGNATURE);
if (sigStr == null) throw new IOException("Signature not found");
_signature = new Signature(Base64.decode(sigStr));
}
public void write(OutputStream out) throws IOException { write(out, true); }
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
StringBuffer buf = new StringBuffer(512);
for (int i = 0; i < _optionNames.length; i++) {
if ( (includeRealSignature) || (!SIGNATURE.equals(_optionNames[i])) )
buf.append(_optionNames[i]).append(':').append(_optionValues[i]).append('\n');
}
out.write(buf.toString().getBytes());
}
public String getProperty(String name) {
for (int i = 0; i < _optionNames.length; i++) {
if (_optionNames[i].equals(name))
return _optionValues[i];
}
return null;
}
private void setProperty(String name, String val) {
for (int i = 0; i < _optionNames.length; i++) {
if (_optionNames[i].equals(name)) {
_optionValues[i] = val;
return;
}
}
String names[] = new String[_optionNames.length + 1];
String values[] = new String[_optionValues.length + 1];
for (int i = 0; i < _optionNames.length; i++) {
names[i] = _optionNames[i];
values[i] = _optionValues[i];
}
names[names.length-1] = name;
values[values.length-1] = val;
_optionNames = names;
_optionValues = values;
}
public String[] getProperties() { return _optionNames; }
public SigningPublicKey[] getPosters() { return _posters; }
public void setPosters(SigningPublicKey posters[]) {
_posters = posters;
StringBuffer buf = new StringBuffer();
for (int i = 0; posters != null && i < posters.length; i++) {
buf.append(Base64.encode(posters[i].getData()));
if (i + 1 < posters.length)
buf.append(',');
}
setProperty(POSTERS, buf.toString());
}
public boolean verify(I2PAppContext ctx) {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream(512);
write(out, false);
return ctx.dsa().verifySignature(_signature, out.toByteArray(), _key);
} catch (IOException ioe) {
return false;
}
}
public void sign(I2PAppContext ctx, SigningPrivateKey priv) {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream(512);
write(out, false);
byte data[] = out.toByteArray();
Signature sig = ctx.dsa().sign(data, priv);
if (sig == null)
throw new IOException("wtf, why is the signature null? data.len = " + data.length + " priv: " + priv);
setProperty(SIGNATURE, Base64.encode(sig.getData()));
_signature = sig;
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public String toString() {
StringBuffer buf = new StringBuffer();
buf.append("Blog ").append(getKey().calculateHash().toBase64());
for (int i = 0; i < _optionNames.length; i++) {
if ( (!SIGNATURE.equals(_optionNames[i])) &&
(!OWNER_KEY.equals(_optionNames[i])) &&
(!POSTERS.equals(_optionNames[i])) )
buf.append(' ').append(_optionNames[i]).append(": ").append(_optionValues[i]);
}
if ( (_posters != null) && (_posters.length > 0) ) {
buf.append(" additional posts by");
for (int i = 0; i < _posters.length; i++) {
buf.append(' ').append(_posters[i].calculateHash().toBase64());
if (i + 1 < _posters.length)
buf.append(',');
}
}
return buf.toString();
}
}
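BlogInfo's sign()/verify() pair above serializes the ordered metadata without the Signature line, signs those bytes, and stores the signature as one more key; verification re-serializes without the Signature line and checks. A hedged sketch of that sign-then-embed pattern using the JDK's java.security DSA rather than I2P's net.i2p.crypto classes (class name and metadata bytes are illustrative only):

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

/**
 * Sketch of the BlogInfo signing pattern: sign the metadata bytes
 * minus the Signature line, then verify by re-serializing the same
 * way. Uses the JDK's DSA purely for illustration.
 */
public class MetaSignSketch {
    public static boolean signAndVerify(byte[] unsigned) throws Exception {
        KeyPairGenerator gen = KeyPairGenerator.getInstance("DSA");
        gen.initialize(1024);
        KeyPair pair = gen.generateKeyPair();

        Signature dsa = Signature.getInstance("SHA1withDSA");
        dsa.initSign(pair.getPrivate());
        dsa.update(unsigned);
        byte[] sig = dsa.sign(); // would be stored as "Signature: base64(...)"

        // a verifier strips the Signature line and checks the rest
        dsa.initVerify(pair.getPublic());
        dsa.update(unsigned);
        return dsa.verify(sig);
    }

    public static void main(String[] args) throws Exception {
        if (!signAndVerify("Owner:abcd\nName:my blog\n".getBytes()))
            throw new AssertionError("signature did not verify");
        System.out.println("ok");
    }
}
```

The key design point carried over from BlogInfo: because the Signature value is excluded from the signed bytes, the signature can be appended to the same key/value file it protects without invalidating itself.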
package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.*;
/**
*
*/
public class BlogURI {
private Hash _blogHash;
private long _entryId;
public BlogURI() {
this(null, -1);
}
public BlogURI(Hash blogHash, long entryId) {
_blogHash = blogHash;
_entryId = entryId;
}
public BlogURI(String uri) {
if (uri.startsWith("blog://")) {
int off = "blog://".length();
_blogHash = new Hash(Base64.decode(uri.substring(off, off+44))); // 44 chars == base64(32 bytes)
int entryStart = uri.indexOf('/', off+1);
if (entryStart < 0) {
_entryId = -1;
} else {
try {
_entryId = Long.parseLong(uri.substring(entryStart+1).trim());
} catch (NumberFormatException nfe) {
_entryId = -1;
}
}
} else {
_blogHash = null;
_entryId = -1;
}
}
public Hash getKeyHash() { return _blogHash; }
public long getEntryId() { return _entryId; }
public void setKeyHash(Hash hash) { _blogHash = hash; }
public void setEntryId(long id) { _entryId = id; }
public String toString() {
if ( (_blogHash == null) || (_blogHash.getData() == null) )
return "";
StringBuffer rv = new StringBuffer(64);
rv.append("blog://").append(Base64.encode(_blogHash.getData()));
rv.append('/');
if (_entryId >= 0)
rv.append(_entryId);
return rv.toString();
}
public boolean equals(Object obj) {
if (obj == null) return false;
if (obj.getClass() != getClass()) return false;
return DataHelper.eq(_entryId, ((BlogURI)obj)._entryId) &&
DataHelper.eq(_blogHash, ((BlogURI)obj)._blogHash);
}
public int hashCode() {
return (int)_entryId;
}
public static void main(String args[]) {
test("http://asdf/");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/123456789");
}
private static void test(String uri) {
BlogURI u = new BlogURI(uri);
if (!u.toString().equals(uri))
System.err.println("Not a match: [" + uri + "] != [" + u.toString() + "]");
}
}
package net.i2p.syndie.data;
/**
*
*/
public class Entry {
private String _text;
public Entry(String raw) {
_text = raw;
}
public String getText() { return _text; }
}
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Securely wrap up an entry and any attachments. Container format:<pre>
* $format\n
* [$key: $val\n]*
* \n
* Signature: $base64(DSA signature)\n
* Size: sizeof(data)\n
* [data bytes]
* </pre>
*
* Required keys:
* BlogKey: base64 of the SHA256 of the blog's public key
* BlogTags: tab delimited list of tags under which this entry should be organized
* BlogEntryId: base10 unique identifier of this entry within the key/path. Typically starts
* as the current day (in unix time, milliseconds) plus further milliseconds for
* each entry within the day.
*
* The data bytes contain a zip file, either in the clear or encrypted. If the format
* is encrypted, the BlogPath key will (likely) be encrypted as well.
*
*/
public class EntryContainer {
private List _rawKeys;
private List _rawValues;
private Signature _signature;
private byte _rawData[];
private BlogURI _entryURI;
private int _format;
private Entry _entryData;
private Attachment _attachments[];
private int _completeSize;
public static final int FORMAT_ZIP_UNENCRYPTED = 0;
public static final int FORMAT_ZIP_ENCRYPTED = 1;
public static final String FORMAT_ZIP_UNENCRYPTED_STR = "syndie.entry.zip-unencrypted";
public static final String FORMAT_ZIP_ENCRYPTED_STR = "syndie.entry.zip-encrypted";
public static final String HEADER_BLOGKEY = "BlogKey";
public static final String HEADER_BLOGTAGS = "BlogTags";
public static final String HEADER_ENTRYID = "BlogEntryId";
public EntryContainer() {
_rawKeys = new ArrayList();
_rawValues = new ArrayList();
_completeSize = -1;
}
public EntryContainer(BlogURI uri, String tags[], byte smlData[]) {
this();
_entryURI = uri;
_entryData = new Entry(new String(smlData));
setHeader(HEADER_BLOGKEY, Base64.encode(uri.getKeyHash().getData()));
StringBuffer buf = new StringBuffer();
for (int i = 0; tags != null && i < tags.length; i++)
buf.append(tags[i]).append('\t');
setHeader(HEADER_BLOGTAGS, buf.toString());
if (uri.getEntryId() < 0)
uri.setEntryId(System.currentTimeMillis());
setHeader(HEADER_ENTRYID, Long.toString(uri.getEntryId()));
}
public int getFormat() { return _format; }
public void load(InputStream source) throws IOException {
String fmt = DataHelper.readLine(source).trim();
if (FORMAT_ZIP_UNENCRYPTED_STR.equals(fmt)) {
_format = FORMAT_ZIP_UNENCRYPTED;
} else if (FORMAT_ZIP_ENCRYPTED_STR.equals(fmt)) {
_format = FORMAT_ZIP_ENCRYPTED;
} else {
throw new IOException("Unsupported entry format: " + fmt);
}
String line = null;
while ( (line = DataHelper.readLine(source)) != null) {
line = line.trim();
int len = line.length();
if (len <= 0)
break;
int split = line.indexOf(':');
if ( (split <= 0) || (split >= len - 2) )
throw new IOException("Invalid format of the syndie entry: line=" + line);
String key = line.substring(0, split);
String val = line.substring(split+1);
_rawKeys.add(key);
_rawValues.add(val);
}
parseHeaders();
String sigStr = DataHelper.readLine(source);
if ( (sigStr == null) || (!sigStr.startsWith("Signature:")) )
throw new IOException("Signature line not found");
sigStr = sigStr.substring("Signature:".length()).trim();
_signature = new Signature(Base64.decode(sigStr));
//System.out.println("Sig: " + _signature.toBase64());
line = DataHelper.readLine(source);
if (line == null)
throw new IOException("Size line not found");
line = line.trim();
int dataSize = -1;
try {
int index = line.indexOf("Size:");
if (index == 0)
dataSize = Integer.parseInt(line.substring("Size:".length()).trim());
} catch (NumberFormatException nfe) {
throw new IOException("Invalid entry size: " + line);
}
if (dataSize < 0)
throw new IOException("Invalid entry size: " + line);
byte data[] = new byte[dataSize];
int read = DataHelper.read(source, data);
if (read != dataSize)
throw new IOException("Incomplete entry: read " + read + " expected " + dataSize);
_rawData = data;
}
public void seal(I2PAppContext ctx, SigningPrivateKey signingKey, SessionKey entryKey) throws IOException {
System.out.println("Sealing " + _entryURI);
if (entryKey == null)
_format = FORMAT_ZIP_UNENCRYPTED;
else
_format = FORMAT_ZIP_ENCRYPTED;
setHeader(HEADER_BLOGKEY, Base64.encode(_entryURI.getKeyHash().getData()));
if (_entryURI.getEntryId() < 0)
_entryURI.setEntryId(ctx.clock().now());
setHeader(HEADER_ENTRYID, Long.toString(_entryURI.getEntryId()));
_rawData = createRawData(ctx, entryKey);
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
write(baos, false);
byte data[] = baos.toByteArray();
_signature = ctx.dsa().sign(data, signingKey);
}
private byte[] createRawData(I2PAppContext ctx, SessionKey entryKey) throws IOException {
byte raw[] = createRawData();
if (entryKey != null) {
byte iv[] = new byte[16];
ctx.random().nextBytes(iv);
byte rv[] = new byte[raw.length + iv.length];
ctx.aes().encrypt(raw, 0, rv, iv.length, entryKey, iv, raw.length);
System.arraycopy(iv, 0, rv, 0, iv.length);
return rv;
} else {
return raw;
}
}
private byte[] createRawData() throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ZipOutputStream out = new ZipOutputStream(baos);
ZipEntry ze = new ZipEntry(ZIP_ENTRY);
byte data[] = _entryData.getText().getBytes();
ze.setTime(0);
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
for (int i = 0; (_attachments != null) && (i < _attachments.length); i++) {
ze = new ZipEntry(ZIP_ATTACHMENT_PREFIX + i + ZIP_ATTACHMENT_SUFFIX);
data = _attachments[i].getData();
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
ze = new ZipEntry(ZIP_ATTACHMENT_META_PREFIX + i + ZIP_ATTACHMENT_META_SUFFIX);
data = _attachments[i].getRawMetadata();
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
}
out.finish();
out.close();
return baos.toByteArray();
}
public static final String ZIP_ENTRY = "entry.sml";
public static final String ZIP_ATTACHMENT_PREFIX = "attachmentdata";
public static final String ZIP_ATTACHMENT_SUFFIX = ".szd";
public static final String ZIP_ATTACHMENT_META_PREFIX = "attachmentmeta";
public static final String ZIP_ATTACHMENT_META_SUFFIX = ".szm";
public void parseRawData(I2PAppContext ctx) throws IOException { parseRawData(ctx, null); }
/** Decrypt the payload if a key is given, then unpack the zip stream into the entry text and its attachments. */
public void parseRawData(I2PAppContext ctx, SessionKey zipKey) throws IOException {
int dataOffset = 0;
if (zipKey != null) {
byte iv[] = new byte[16];
System.arraycopy(_rawData, 0, iv, 0, iv.length);
ctx.aes().decrypt(_rawData, iv.length, _rawData, iv.length, zipKey, iv, _rawData.length - iv.length);
dataOffset = iv.length;
}
ByteArrayInputStream in = new ByteArrayInputStream(_rawData, dataOffset, _rawData.length - dataOffset);
ZipInputStream zi = new ZipInputStream(in);
Map attachments = new HashMap();
Map attachmentMeta = new HashMap();
while (true) {
ZipEntry entry = zi.getNextEntry();
if (entry == null)
break;
ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
int read = -1;
while ( (read = zi.read(buf)) != -1)
out.write(buf, 0, read);
byte entryData[] = out.toByteArray();
String name = entry.getName();
if (ZIP_ENTRY.equals(name)) {
_entryData = new Entry(new String(entryData));
} else if (name.startsWith(ZIP_ATTACHMENT_PREFIX)) {
attachments.put(name, (Object)entryData);
} else if (name.startsWith(ZIP_ATTACHMENT_META_PREFIX)) {
attachmentMeta.put(name, (Object)entryData);
}
//System.out.println("Read entry [" + name + "] with size=" + entryData.length);
}
_attachments = new Attachment[attachments.size()];
for (int i = 0; i < attachments.size(); i++) {
byte data[] = (byte[])attachments.get(ZIP_ATTACHMENT_PREFIX + i + ZIP_ATTACHMENT_SUFFIX);
byte metadata[] = (byte[])attachmentMeta.get(ZIP_ATTACHMENT_META_PREFIX + i + ZIP_ATTACHMENT_META_SUFFIX);
if ( (data != null) && (metadata != null) )
_attachments[i] = new Attachment(data, metadata);
else
System.out.println("Unable to get " + i + ": " + data + "/" + metadata);
}
//System.out.println("Attachments: " + _attachments.length + "/" + attachments.size() + ": " + attachments);
}
public BlogURI getURI() { return _entryURI; }
private static final String NO_TAGS[] = new String[0];
public String[] getTags() {
String tags = getHeader(HEADER_BLOGTAGS);
if ( (tags == null) || (tags.trim().length() <= 0) ) {
return NO_TAGS;
} else {
StringTokenizer tok = new StringTokenizer(tags, "\t");
String rv[] = new String[tok.countTokens()];
for (int i = 0; i < rv.length; i++)
rv[i] = tok.nextToken().trim();
return rv;
}
}
public Signature getSignature() { return _signature; }
public Entry getEntry() { return _entryData; }
public Attachment[] getAttachments() { return _attachments; }
public void setCompleteSize(int bytes) { _completeSize = bytes; }
public int getCompleteSize() { return _completeSize; }
public String getHeader(String key) {
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
if (k.equals(key))
return (String)_rawValues.get(i);
}
return null;
}
public Map getHeaders() {
Map rv = new HashMap(_rawKeys.size());
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
String v = (String)_rawValues.get(i);
rv.put(k,v);
}
return rv;
}
public void setHeader(String name, String val) {
int index = _rawKeys.indexOf(name);
if (index < 0) {
_rawKeys.add(name);
_rawValues.add(val);
} else {
_rawValues.set(index, val);
}
}
public void addAttachment(byte data[], String name, String description, String mimeType) {
Attachment a = new Attachment(data, name, description, mimeType);
int old = (_attachments == null ? 0 : _attachments.length);
Attachment nv[] = new Attachment[old+1];
if (old > 0)
System.arraycopy(_attachments, 0, nv, 0, old);
nv[old] = a;
_attachments = nv;
}
private void parseHeaders() throws IOException {
String keyHash = getHeader(HEADER_BLOGKEY);
String idVal = getHeader(HEADER_ENTRYID);
if (keyHash == null)
throw new IOException("Missing " + HEADER_BLOGKEY + " header");
long entryId = -1;
if ( (idVal != null) && (idVal.length() > 0) ) {
try {
entryId = Long.parseLong(idVal.trim());
} catch (NumberFormatException nfe) {
throw new IOException("Invalid format of entryId (" + idVal + ")");
}
}
_entryURI = new BlogURI(new Hash(Base64.decode(keyHash)), entryId);
}
/** Verify the signature against the blog's primary key, falling back to any additional authorized posters. */
public boolean verifySignature(I2PAppContext ctx, BlogInfo info) {
if (_signature == null) throw new NullPointerException("sig is null");
if (info == null) throw new NullPointerException("info is null");
if (info.getKey() == null) throw new NullPointerException("info key is null");
if (info.getKey().getData() == null) throw new NullPointerException("info key data is null");
//System.out.println("Verifying " + _entryURI + " for " + info);
ByteArrayOutputStream out = new ByteArrayOutputStream(_rawData.length + 512);
try {
write(out, false);
byte dat[] = out.toByteArray();
//System.out.println("Raw data to verify: " + ctx.sha().calculateHash(dat).toBase64() + " sig: " + _signature.toBase64());
ByteArrayInputStream in = new ByteArrayInputStream(dat);
boolean ok = ctx.dsa().verifySignature(_signature, in, info.getKey());
if (!ok && info.getPosters() != null) {
for (int i = 0; !ok && i < info.getPosters().length; i++) {
in.reset();
ok = ctx.dsa().verifySignature(_signature, in, info.getPosters()[i]);
}
}
//System.out.println("Verified ok? " + ok + " key: " + info.getKey().calculateHash().toBase64());
//new Exception("verifying").printStackTrace();
return ok;
} catch (IOException ioe) {
//System.out.println("Verification failed! " + ioe.getMessage());
return false;
}
}
/** Serialize the container: format line, headers, signature (left blank when computing or verifying it), size, and the raw payload. */
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
StringBuffer buf = new StringBuffer(512);
switch (_format) {
case FORMAT_ZIP_ENCRYPTED:
buf.append(FORMAT_ZIP_ENCRYPTED_STR).append('\n');
break;
case FORMAT_ZIP_UNENCRYPTED:
buf.append(FORMAT_ZIP_UNENCRYPTED_STR).append('\n');
break;
default:
throw new IOException("Invalid format " + _format);
}
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
buf.append(k.trim());
buf.append(": ");
buf.append(((String)_rawValues.get(i)).trim());
buf.append('\n');
}
buf.append('\n');
buf.append("Signature: ");
if (includeRealSignature)
buf.append(Base64.encode(_signature.getData()));
buf.append("\n");
buf.append("Size: ").append(_rawData.length).append('\n');
String str = buf.toString();
//System.out.println("Writing raw: \n[" + str + "] / " + I2PAppContext.getGlobalContext().sha().calculateHash(str.getBytes()) + ", raw data: " + I2PAppContext.getGlobalContext().sha().calculateHash(_rawData).toBase64() + "\n");
out.write(str.getBytes());
out.write(_rawData);
}
public String toString() { return _entryURI.toString(); }
}
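The zip layout that `createRawData()` builds and `parseRawData()` walks can be sketched standalone: an `entry.sml` entry followed by numbered `attachmentdataN.szd` / `attachmentmetaN.szm` pairs. This is a minimal illustration using only `java.util.zip`; it does not depend on the I2P or Syndie classes, and the class name is hypothetical.

```java
import java.io.*;
import java.util.*;
import java.util.zip.*;

/**
 * Minimal sketch of the zip payload layout used by EntryContainer:
 * "entry.sml" plus numbered attachment data/metadata entries.
 * Standalone illustration only; names are hypothetical.
 */
public class PayloadLayoutSketch {
    /** Build a two-entry payload, then return the entry names read back in order. */
    public static List<String> entryNames() throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ZipOutputStream out = new ZipOutputStream(baos);
        ZipEntry ze = new ZipEntry("entry.sml");
        ze.setTime(0); // zero the timestamp so identical content zips identically
        out.putNextEntry(ze);
        out.write("Subject: hello".getBytes());
        out.closeEntry();
        out.putNextEntry(new ZipEntry("attachmentdata0.szd"));
        out.write(new byte[] { 1, 2, 3 });
        out.closeEntry();
        out.finish();
        out.close();
        // read it back the same way parseRawData() walks the stream
        ZipInputStream zi = new ZipInputStream(new ByteArrayInputStream(baos.toByteArray()));
        List<String> names = new ArrayList<String>();
        ZipEntry entry;
        while ((entry = zi.getNextEntry()) != null)
            names.add(entry.getName());
        return names;
    }
    public static void main(String[] args) throws IOException {
        System.out.println(entryNames());
    }
}
```

Zeroing each entry's timestamp, as `createRawData()` does for `entry.sml`, keeps the serialized payload stable across machines, which matters because the signature covers the raw bytes.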
package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
/**
 * Writable archive index (ArchiveIndex itself is read-only).
 */
public class LocalArchiveIndex extends ArchiveIndex {
public LocalArchiveIndex() {
super(false);
}
public void setGeneratedOn(long when) { _generatedOn = when; }
public void setVersion(String v) { _version = v; }
public void setHeaders(Properties headers) { _headers = headers; }
public void setHeader(String key, String val) { _headers.setProperty(key, val); }
public void setAllBlogs(int count) { _allBlogs = count; }
public void setNewBlogs(int count) { _newBlogs = count; }
public void setAllEntries(int count) { _allEntries = count; }
public void setNewEntries(int count) { _newEntries = count; }
public void setTotalSize(long bytes) { _totalSize = bytes; }
public void setNewSize(long bytes) { _newSize = bytes; }
public void addBlog(Hash key, String tag, long lastUpdated) {
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary s = (BlogSummary)_blogs.get(i);
if ( (s.blog.equals(key)) && (s.tag.equals(tag)) ) {
s.lastUpdated = Math.max(s.lastUpdated, lastUpdated);
return;
}
}
BlogSummary summary = new ArchiveIndex.BlogSummary();
summary.blog = key;
summary.tag = tag;
summary.lastUpdated = lastUpdated;
_blogs.add(summary);
}
public void addBlogEntry(Hash key, String tag, String entry) {
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(key) && (summary.tag.equals(tag)) ) {
long entryId = Archive.getEntryIdFromIndexName(entry);
int kb = Archive.getSizeFromIndexName(entry);
System.out.println("Adding entry " + entryId + ", size=" + kb + "KB [" + entry + "]");
EntrySummary entrySummary = new EntrySummary(new BlogURI(key, entryId), kb);
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary cur = (EntrySummary)summary.entries.get(j);
if (cur.entry.equals(entrySummary.entry))
return;
}
summary.entries.add(entrySummary);
return;
}
}
}
public void addNewestBlog(Hash key) {
if (!_newestBlogs.contains(key))
_newestBlogs.add(key);
}
public void addNewestEntry(BlogURI entry) {
if (!_newestEntries.contains(entry))
_newestEntries.add(entry);
}
}
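The merge rule in `addBlog()` above can be shown in isolation: the index keeps at most one summary per (blog, tag) pair, and a repeated add only moves `lastUpdated` forward. This is a hedged sketch with hypothetical names, not the Syndie implementation.

```java
import java.util.*;

/**
 * Sketch of the dedupe rule in LocalArchiveIndex.addBlog(): one summary per
 * (blog, tag) pair, keeping the most recent lastUpdated. Hypothetical names.
 */
public class BlogMergeSketch {
    private final Map<List<String>, Long> summaries = new HashMap<List<String>, Long>();
    public void addBlog(String keyHash, String tag, long lastUpdated) {
        List<String> id = Arrays.asList(keyHash, tag);
        Long prev = summaries.get(id);
        // merge instead of duplicating: keep the newer update time
        summaries.put(id, (prev == null) ? lastUpdated : Math.max(prev, lastUpdated));
    }
    public int size() { return summaries.size(); }
    public long lastUpdated(String keyHash, String tag) {
        return summaries.get(Arrays.asList(keyHash, tag));
    }
    public static void main(String[] args) {
        BlogMergeSketch index = new BlogMergeSketch();
        index.addBlog("blogA", "tech", 100);
        index.addBlog("blogA", "tech", 50);  // older update is absorbed
        index.addBlog("blogA", "life", 75);  // different tag -> separate summary
        System.out.println(index.size() + " " + index.lastUpdated("blogA", "tech"));
    }
}
```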
package net.i2p.syndie.data;
/**
 * Splits a schema://location URL, sanitizing the location for safe rendering.
 */
public class SafeURL {
private String _schema;
private String _location;
private String _name;
private String _description;
public SafeURL(String raw) {
parse(raw);
}
private void parse(String raw) {
if (raw != null) {
int index = raw.indexOf("://");
if ( (index <= 0) || (index + 3 >= raw.length()) )
return;
_schema = raw.substring(0, index);
_location = raw.substring(index+3);
// String.replace returns a new String, so the result must be assigned back
_location = _location.replace('>', '_');
_location = _location.replace('<', '^');
}
}
public String getSchema() { return _schema; }
public String getLocation() { return _location; }
public String toString() { return _schema + "://" + _location; }
}
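SafeURL's parse rule can be sketched standalone: split at the first `"://"`, reject malformed input, and substitute `'<'` and `'>'` in the location so it cannot break out of generated markup. Note that `String.replace` returns a new String and its result must be kept. The class and method names here are hypothetical.

```java
/**
 * Sketch of SafeURL-style parsing: schema/location split plus sanitization.
 * Standalone illustration; names are hypothetical.
 */
public class SafeUrlSketch {
    /** Return {schema, sanitized location}, or null if the URL is malformed. */
    public static String[] parse(String raw) {
        if (raw == null) return null;
        int index = raw.indexOf("://");
        if ((index <= 0) || (index + 3 >= raw.length()))
            return null; // no schema, or nothing after "://"
        String schema = raw.substring(0, index);
        // replace() returns a new String; chain and keep the result
        String location = raw.substring(index + 3).replace('>', '_').replace('<', '^');
        return new String[] { schema, location };
    }
    public static void main(String[] args) {
        String[] parts = parse("http://example.i2p/<b>");
        System.out.println(parts[0] + " " + parts[1]);
    }
}
```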