An RSA signature has the length of the modulus: with the usual 2048-bit key size, signatures are 256 bytes long. Since you want to fit the signature within a URL, which is text-based, you need some sort of encoding such as Base64, which implies some size overhead; with Base64, 256 bytes become 342 characters (once the trailing '=' padding is dropped) -- that makes for a URL of quite respectable length.
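For a concrete feel of that overhead, here is a quick Python sketch; the 256 random bytes merely stand in for a real signature, since only the length matters here:

```python
import base64
import os

# A 2048-bit RSA signature is exactly 256 bytes long; random bytes stand in
# for a real signature, since only the length matters for the encoding.
signature = os.urandom(256)

# Base64url without the trailing '=' padding, as usually done in URLs.
encoded = base64.urlsafe_b64encode(signature).rstrip(b"=")
print(len(encoded))  # 342 characters
```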
DSA (and its highly fashionable elliptic-curve variant ECDSA) offers smaller signatures; you can get very decent security (as much as 2048-bit RSA) with signatures of size 60 bytes or so -- there again, Base64 encoding inflates that to 80 characters.
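As an illustration, here is a sketch using the third-party Python cryptography package (an assumption on my part; any ECDSA implementation would do) with curve P-256, which gives about 128-bit security; P-256 signatures are 64 bytes rather than 60, so the Base64 form comes out at 86 characters rather than 80:

```python
import base64

# Third-party "cryptography" package (pip install cryptography) -- an
# assumption; any ECDSA library would do.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import decode_dss_signature

private_key = ec.generate_private_key(ec.SECP256R1())  # P-256, ~128-bit security
der_sig = private_key.sign(b"url data to protect", ec.ECDSA(hashes.SHA256()))

# The library returns a DER-encoded signature (70-72 bytes); the raw form is
# just r and s, 32 bytes each, i.e. 64 bytes total.
r, s = decode_dss_signature(der_sig)
raw_sig = r.to_bytes(32, "big") + s.to_bytes(32, "big")

print(len(raw_sig))                                         # 64 bytes
print(len(base64.urlsafe_b64encode(raw_sig).rstrip(b"=")))  # 86 characters
```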
If you need even smaller signatures, then you must turn to less mainstream algorithms, e.g. BLS, whose signatures are half the size of DSA's. But the mathematics involved is fiendishly more complex, and there is no standard, only scientific papers.
Alternatively, you may want to use RSA with ISO 9796-2 signatures (not PKCS#1). You still get 256-byte signatures, but you can then smuggle some of your application data within the signature itself, since ISO 9796-2 is a signature scheme with message recovery (the verifier gets that data back when checking the signature). As a rough approximation, you get n-bit security with a total overhead of about 3n bits, albeit with a minimum total size of 256 bytes.
Theoretical limit: if you want a security level of n bits (meaning that attackers are assumed not to be able to run computations which need more than 2^n elementary operations), then the signature size cannot be less than n bits either. Indeed, an attacker could try brute force on the signature, trying out all possible signature values and using the signature verification algorithm (which uses Alice's public key, which is public, hence known to the attacker) until a matching signature is found.
However, there is no currently known secure digital signature algorithm which offers n-bit security with n-bit signatures. DSA needs 4n-bit signatures to offer n-bit security; BLS lowers that to 2n bits. There have been proposals for other algorithm types which could go down to about 1.5n bits, but so far they have all turned out to be flawed in some way.
It may be possible to change your model slightly. For instance, Alice and Bob may share some secret value; Bob then does not really trust "Alice" but rather "whoever knows the secret value, which happens to be known to both Alice and Bob". In that case, you can rely on a MAC algorithm instead of a digital signature; you can think of MAC algorithms as signatures where the key used to sign and the key used to verify are the same.
It is a change of context. Depending on who Alice and Bob are in your system, sharing a secret may or may not be possible. Using a MAC forfeits most chances at non-repudiation. It also means that Bob can himself produce URLs that he will accept. Yet, if you can tolerate the use of a MAC, then you can get very short verification elements. In fact, you can even get below n bits, because exhaustive search can no longer be applied: since the MAC verification key is secret, the attacker cannot try MAC values and decide whether they are correct on his own machines. To "try" a potential MAC value, the attacker MUST send it to Bob and see whether Bob is happy with it or not. After a million tries or so, Bob may begin to suspect foul play and apply countermeasures (e.g. cease to respond to that decidedly dodgy requester).
In that sense, a MAC value of length 64 bits ought to be sufficient. That's just 8 bytes -- encodable with Base64 as 11 characters. HMAC is a widely implemented MAC algorithm with good repute.
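Here is a minimal sketch of such a truncated HMAC, using only the Python standard library; the shared key shown is a placeholder, not a value to use as-is:

```python
import base64
import hashlib
import hmac

# Placeholder shared secret -- in practice, generate a random key (e.g. 32
# bytes from os.urandom) and store it on both Alice's and Bob's side.
SECRET_KEY = b"shared-secret-known-to-alice-and-bob"

def short_tag(data: bytes) -> str:
    """HMAC-SHA256 truncated to 64 bits, Base64url-encoded (11 characters)."""
    full_mac = hmac.new(SECRET_KEY, data, hashlib.sha256).digest()
    return base64.urlsafe_b64encode(full_mac[:8]).rstrip(b"=").decode()

print(short_tag(b"user=alice&action=download&file=report.pdf"))  # 11 characters
```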
There is no well-established standard for "signed URL". Every site designer works out his own scheme, which is unfortunate because homemade cryptography is one of the surest paths to disaster. The conceptual model, though, is sound: basically, you want Mallory to convey a message from Alice to Bob, such that Bob can know that the message really comes from Alice, and was not altered or even invented altogether by Mallory. That the message travels as a "URL" is a mere encoding constraint which has no impact on the concept.
A MAC or a signature is indeed the right tool here. Beware, though, of replay attacks: if Mallory has an Alice-signed URL that Bob will accept, Mallory may try to send that URL twice to Bob, to get twice the effect. Depending on your context, this may or may not be a problem. If it is, then Bob MUST enforce some protection mechanism, e.g. remembering past requests so as to reject duplicates. A time stamp within the URL data (under cover of the MAC or signature) can help Bob keep track of seen URLs (i.e. Bob can forget "expired" URLs since he won't accept them any more).
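To tie the pieces together, here is a sketch of what one homemade signed-URL scheme could look like, with the time stamp under the MAC and a constant-time comparison on Bob's side; the parameter names, the 64-bit tag and the 5-minute expiry window are illustrative assumptions, not a standard:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import parse_qs, urlencode, urlsplit

# Illustrative choices, not a standard: placeholder shared key, 64-bit tag,
# 5-minute validity window.
SECRET_KEY = b"shared-secret-known-to-alice-and-bob"
MAX_AGE = 300  # seconds

def _tag(query: str) -> str:
    # HMAC-SHA256 truncated to 64 bits, Base64url-encoded without padding.
    mac = hmac.new(SECRET_KEY, query.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(mac[:8]).rstrip(b"=").decode()

def sign_url(base: str, params: dict) -> str:
    # Alice: put a time stamp under the MAC, then append the tag as 'sig'.
    params = dict(params, ts=str(int(time.time())))
    query = urlencode(sorted(params.items()))
    return f"{base}?{query}&sig={_tag(query)}"

def verify_url(url: str) -> bool:
    # Bob: recompute the tag over everything before '&sig=', compare in
    # constant time, and reject URLs whose time stamp is too old.
    query = urlsplit(url).query
    unsigned, sep, sig = query.rpartition("&sig=")
    if not sep or not hmac.compare_digest(_tag(unsigned).encode(), sig.encode()):
        return False
    ts = int(parse_qs(unsigned)["ts"][0])
    return time.time() - ts <= MAX_AGE

url = sign_url("https://bob.example/download", {"user": "alice", "file": "report.pdf"})
print(url)
print(verify_url(url))  # True (and False again once MAX_AGE has elapsed)
```

Note that expiry alone only bounds the replay window; if a single use must be enforced, Bob still has to remember the URLs he has already honoured within that window.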