Hacker News

> Scala has so many complex features

I think operator overloading and implicit conversions are the biggest problem.

You look at Scala code and first don't get anything because it's full of library specific symbols.

Then these implicit conversions mangle your objects behind your back and you don't even know what's happening at run time.

I like the ideas behind these features, but as a beginner, they always scared me away after looking into "production code" :\



The problem is not "production code", but library code. I write a lot of business code in Scala, and a large majority of it is very easy to read for the newcomers we have. What is absolutely terrifying is what happens when you look under the covers: If you look at the implementations of libraries, as a newbie, you will want to run away, because you need to understand many language features before you can even understand a thing.

This is why I ended up writing a series of articles on practical implicits. Implicits are pretty much unavoidable, but the Scala website's documentation on them is appalling. If they are explained through examples, showing how and when it makes sense to use them, we can deal with them reasonably.


Sure, you can write code that works without understanding why, but if you want to be competent you need to understand how it all works.


Somebody recently made the astute observation that they love their own Scala code but always find other people's Scala code very difficult to read, and that this is probably a reason why Scala never really caught on.


I've been writing and maintaining Scala code for about 4-5 years, and I found reading other people's code in our team wasn't that bad. Then again, we do code reviews and such to make sure the code stays readable. Reading library code, on the other hand...


Note that unconventional symbols are very rare in the standard library. Third-party libraries, especially those made by the scaskell community, are where the problem lies. If one uses scalaz one gets what one deserves.

Edit: replaced "you" with "one" to clarify


I was going to mention Haskell as being similar. Lots of custom infix operators. On the one hand, once you know a library's custom operators it can make for very clean code, but it can be tough to read code for a library you don't know. I guess it's mitigated a bit by the fact that a lot of Haskell operators generalize well, so the same symbols tend to be used to do something similar, but it can be abused.


Never used scalaz.

I looked into stuff like Lift and Play


Lift is worse than Scalaz and barely maintained. I can't speak for Play; I use Spray for REST APIs (somewhat symbolic, but oh so worth it for the glorious routing DSL that combines the clarity of a config file with the refactorability and safety of code) and Wicket for HTML UIs.


Just an example of Spray:

    get {
      path("foo" / IntNumber) { itemId =>
        complete("You just looked for foo #" + itemId)
      }
    }
Or:

    path("foo") {
      parameter('id.as[Int].?) { (maybeId: Option[Int]) =>
        ...
      }
    }
I can't say I have any objection to this bizarre use of symbology.


Someone asked recently on the mailing list if Lift is dead.

https://groups.google.com/forum/#!msg/liftweb/dA5NUjkqB-4


Link didn't work for me, but I suspect Lift has been dead for a while. It was fine for its time, but if you want a full framework, Play is pretty much better in every way, and if you want a light framework you have Akka HTTP and others.



Yes, it was probably 2009 that I looked into Lift, when it was considered THE Scala web framework. But later Play seems to have taken its place.


> You look at Scala code and first don't get anything because it's full of library specific symbols.

I've been using Scala for years and I've had minimal contact with libraries using symbols in any significant way (most libraries I commonly depend on use no operator overloading at all). The only ones which come to mind are some of the linked list operations and Map(foo -> bar, abc -> def) stuff.
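For context, even the familiar `->` used in `Map(...)` is not special syntax: it's an extension method added by an implicit class in `Predef` (`ArrowAssoc`). A minimal sketch of that mechanism, using a made-up operator name `-->` so it doesn't shadow the real one:

```scala
object ArrowDemo {
  // Rough sketch of how Predef's ArrowAssoc works; MyArrow and --> are
  // hypothetical names chosen to avoid clashing with the real implicit.
  implicit class MyArrow[A](private val self: A) extends AnyVal {
    def -->[B](that: B): (A, B) = (self, that)
  }

  def main(args: Array[String]): Unit = {
    val pair = "foo" --> 42          // expands to MyArrow("foo").-->(42)
    println(pair)                    // (foo,42)
    val m = Map("a" --> 1, "b" --> 2) // Map.apply just takes tuples
    println(m("b"))                  // 2
  }
}
```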


Implicit conversions are now considered an anti-pattern and not recommended.


Is "Pimp my Library", as used to be recommended by Odersky, also an anti-pattern now? It uses implicit conversions, if I'm not mistaken.


No, Pimp my Library is not an anti-pattern. It even received dedicated syntax with `implicit class`es. Contrary to general implicit conversions, implicit classes are not dangerous because things are not actually converted as far as the developer can see (yes, under the hood, it's an implicit conversion, but it doesn't bite you). You're simply adding methods on existing classes, which is similar to extension methods in several other languages.


And in fact, if you combine implicit classes with value classes (e.g. implicit class RichFoo(val value: Foo) extends AnyVal { ... }) then you're effectively just adding extension methods, and there's not even an allocation involved.
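A minimal, self-contained sketch of the pattern described above (`Foo` and `RichFoo` are hypothetical names for illustration):

```scala
object ExtensionDemo {
  final case class Foo(n: Int)

  // Implicit value class: adds a method to Foo; because it extends AnyVal,
  // the compiler can usually avoid allocating the wrapper at call sites.
  implicit class RichFoo(private val value: Foo) extends AnyVal {
    def doubled: Foo = Foo(value.n * 2)
  }

  def main(args: Array[String]): Unit = {
    // Reads as if Foo had a `doubled` method all along.
    println(Foo(21).doubled)  // Foo(42)
  }
}
```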


I didn't even know implicit classes existed. Thanks!


It does, and yes I think it is now considered an anti-pattern. The proper way (IMO) to extend libraries is to use type-classes.
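For the record, a minimal sketch of the type-class approach being recommended here. `Show` is the classic textbook example, not any specific library's API:

```scala
object TypeClassDemo {
  // The type class itself: a capability defined separately from the types.
  trait Show[A] {
    def show(a: A): String
  }

  // Instances for existing types we don't control.
  implicit val intShow: Show[Int] = (a: Int) => s"Int($a)"
  implicit val stringShow: Show[String] = (a: String) => s""""$a""""

  // Works for any A that has a Show instance in implicit scope.
  def describe[A](a: A)(implicit s: Show[A]): String = s.show(a)

  def main(args: Array[String]): Unit = {
    println(describe(42))    // Int(42)
    println(describe("hi"))  // "hi"
  }
}
```

Unlike an implicit conversion, nothing is silently rewritten: a missing instance is a compile error at the call site.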


It would be more accurate to say that abuse of implicit conversions is now considered an anti-pattern. They are a core part of the language, e.g. many of the extra methods Scala adds to Java's String class.
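Concretely, methods like `toInt` don't exist on `java.lang.String`; if I recall correctly they come from an implicit conversion in `Predef` (`augmentString`, to `StringOps`), which is why they feel built in:

```scala
object StringOpsDemo {
  def main(args: Array[String]): Unit = {
    // java.lang.String has neither toInt nor *; the implicit conversion
    // to StringOps supplies both, invisibly to the caller.
    println("42".toInt + 1)  // 43
    println("ab" * 3)        // ababab
  }
}
```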


> implicit def intToFloat(x: Int) = x.toFloat

I've seen code like this pop up way more than I'm comfortable admitting.


This is such an interesting choice of example because this particular implicit is baked into the language, and the discomfort you feel is a fine measure of how good an idea this is. Last I checked there was no way to turn it off.

  scala> def f(x: Float) = x.round
  f: (x: Float)Int

  scala> f(123456789)
  res0: Int = 123456792
But of course it doesn't stop at converting Ints into Floats.

  scala> f(Long.MaxValue)
  res1: Int = 2147483647
So it seems 9223372036854775807 rounds to 2147483647. This can be a real simplifier if your "big data" is getting too big.
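If memory serves, Scala 2's `-Ywarn-numeric-widen` compiler flag will at least warn about these silent widenings, though the widening itself can't be disabled. Failing that, writing the conversions explicitly makes the precision loss visible at the call site. A small sketch reproducing the REPL session above:

```scala
object WideningDemo {
  def f(x: Float): Int = x.round

  def main(args: Array[String]): Unit = {
    // Explicit .toFloat shows exactly where precision is lost,
    // instead of relying on the built-in implicit widening.
    println(f(123456789.toFloat))         // 123456792
    println(Long.MaxValue.toFloat.round)  // 2147483647 (math.round clamps to Int range)
  }
}
```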


This is an anti-pattern IMO and requires an import to opt-in (by the author, not the caller IIRC). Instead you'd use an implicit value class which should be zero overhead.

But that's just about the definition, not what it's doing which is no good. Anyone can abuse a lot of these features, which depending upon who you'd ask, is better flexibility than not having them at all.


You can write bad code in any language.


This is true, and in some ways I get annoyed by this saying, because it doesn't add much value.

I think there are two better ways of evaluating a language in terms of good code/bad code:

1. Does the language (and its idioms) encourage good or bad code?

2. Does the language support strong enough abstraction facilities to allow good code to be written?

Now, of course, the problem is that people will disagree about what counts as good code, but these at least make you think about what's important in a language.

Edit: Formatting


Really? Nice to know.

I had the feeling that "Programming in Scala" prized them as the killer feature, back in 2009 :)


Yes, you have to explicitly enable it; otherwise it triggers a compiler error.


Explicitly enable implicit conversions? Nice play, Mr. Odersky...


You have to explicitly enable the definition of new implicit conversions. Usage of implicit conversions is automatic.
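A sketch of what that looks like at the definition site in Scala 2 (`Meters` and the conversion are made-up examples):

```scala
object ConversionDemo {
  // Without this import (or the -language:implicitConversions compiler flag),
  // defining the conversion below triggers a feature warning
  // (an error under -Xfatal-warnings).
  import scala.language.implicitConversions

  final case class Meters(value: Double)

  implicit def doubleToMeters(d: Double): Meters = Meters(d)

  def len(m: Meters): Double = m.value

  def main(args: Array[String]): Unit = {
    // Callers need no import at all: the conversion applies silently.
    println(len(3.5))  // 3.5
  }
}
```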


However, they are heavily used in the standard library, e.g. the dreadful CanBuildFrom.


That's an implicit parameter, something entirely different. It is unfortunate that so many critics conflate the two.


Implicit parameters are not the same as implicit conversions.
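A side-by-side sketch of the two, since they are easy to conflate (all names here are made up for illustration):

```scala
object ImplicitKindsDemo {
  import scala.language.implicitConversions

  // 1. Implicit parameter: the compiler fills in an argument from scope.
  //    The value is passed as-is; nothing changes type.
  case class Config(verbose: Boolean)
  implicit val defaultConfig: Config = Config(verbose = true)
  def log(msg: String)(implicit cfg: Config): String =
    if (cfg.verbose) s"[verbose] $msg" else msg

  // 2. Implicit conversion: the compiler rewrites a value to another type.
  case class Seconds(n: Int)
  implicit def intToSeconds(n: Int): Seconds = Seconds(n)
  def delay(s: Seconds): Int = s.n

  def main(args: Array[String]): Unit = {
    println(log("hi"))   // [verbose] hi  (defaultConfig supplied implicitly)
    println(delay(5))    // 5             (the Int 5 is converted to Seconds(5))
  }
}
```

CanBuildFrom is the first kind: an implicit parameter that selects a builder, not a conversion of your values.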


Implicit parameters are no prize either, though. They seem to lead to massive, massive snarls in the codebases I end up exposed to regularly.


I've come to realize that there's no intuitive meaning to the infix tilde operator. Therefore, assigning it meaning is asking for trouble.

EDIT: added infix


> I think operator overloading and implicit conversions are the biggest problem.

Not every time. Sometimes they can be great. Just look at some BigDecimal code in Java versus other languages.

In Scala (and Python) it's:

    BigDecimal(1.00) + BigDecimal(2.00)

In Java it is:

    new BigDecimal(1.00).add(new BigDecimal(2.00)) // and so on
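Worth noting that the Scala version needs no conversion magic at all: `+` is simply a method name, and `a + b` is sugar for `a.+(b)`. A quick sketch:

```scala
object BigDecimalDemo {
  def main(args: Array[String]): Unit = {
    // scala.math.BigDecimal defines a method literally named +.
    // (String constructors avoid binary-float surprises from 1.00.)
    val sum = BigDecimal("1.00") + BigDecimal("2.00")
    println(sum)  // 3.00

    // Mixing in an Int works via an implicit conversion provided
    // by the BigDecimal companion object.
    println(BigDecimal(1) + 2)  // 3
  }
}
```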



