Convert scala.collection.immutable.Vector to java.util.Collection
You need to call asJavaCollection to do the conversion; this should work:

import scala.collection.JavaConverters._

var attrDefs = Vector(new AttributeDefinition(), new AttributeDefinition())
request.setAttributeDefinitions(attrDefs.asJavaCollection)

As an alternative, you can import scala.collection.JavaConversions._ so you don't have to call asJavaCollection at all. However, I find the code more readable when the conversion is called explicitly. Here's the alternative:

import scala.collection.JavaConversions._

var attrDefs = Vector(new AttributeDefinition(), new AttributeDefinition())
request.setAttributeDefinitions(attrDefs)
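On Scala 2.13 and later, JavaConversions is gone and JavaConverters is deprecated in favour of scala.jdk.CollectionConverters, which offers the same explicit style (assuming the same AttributeDefinition and request types as above):

import scala.jdk.CollectionConverters._

val attrDefs = Vector(new AttributeDefinition(), new AttributeDefinition())
request.setAttributeDefinitions(attrDefs.asJavaCollection)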

Categories : Java

Scala convert Iterable or collection.Seq to collection.immutable.Seq
Use the to method to convert between arbitrary collection types in Scala 2.10:

scala> Array(1, 2, 3).toSeq
res0: Seq[Int] = WrappedArray(1, 2, 3)

scala> Array(1, 2, 3).to[collection.immutable.Seq]
res1: scala.collection.immutable.Seq[Int] = Vector(1, 2, 3)
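On Scala 2.13 and later the type-parameter form was replaced by a value argument (the target collection's companion object), so the equivalent call would be:

Array(1, 2, 3).to(collection.immutable.Seq)  // an immutable Seq (List(1, 2, 3) by default in 2.13)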

Categories : Scala

Wrong collection.length when passing JSON array to Backbone Collection
If you dump one of the items returned by Google Spreadsheets, you will see that the data is nested in multiple objects, something like this:

{
  "id": {"$t": "https://spreadsheets.google.com/feeds/list/..."},
  "updated": {"$t": "2013-07-30T12:01:24.000Z"},
  "category": [{"scheme": "...", "term": "..."}],
  "title": {"type": "text", "$t": "ACIW"},
  "content": {},
  "link": [{"rel": "self", "type": "application/atom+xml", "href": "..."}]
}

See this Fiddle: http://jsfiddle.net/nikoshr/kHBvY/

Note how the id property is wrapped in an object:

"id": {"$t": "https://spreadsheets.google.com/feeds/list/0AjbU8ta9j916dFdjSVg3YkNPUUJnWkZSWjBDWmZab3c/1/public/basic/cokwr"}

Backbone collections don't allow duplicates, and duplicates are determined by their id. All your items are considered duplicates and

Categories : Javascript

Exception handling around shift in Scala
Just stating the obvious: this is a case of Scala type inference meeting the CPS annotations. The catch block does not contain any CPS-annotated expression, yet it is expected to have the same type as the try block:

Unit @cps[Unit] // same as Unit @cpsParam[Unit,Unit]

In my experience, type inference and the CPS transformation in Scala do not always work as expected, and things that work in one Scala version do not work in another. There are workarounds such as the try_protector mentioned in "Scala Continuations - Why can't my shifted call be inside a try-catch block?". Not sure if it helps in your case (i.e. Scala 2.10.2).

Categories : Scala

Collection in Scala
Collection<?> args

It's written Collection<?> (pronounced "collection of unknown"), that is, a collection whose element type matches anything. It's called a wildcard type for obvious reasons. Read more about wildcards here. For Scala, see existential types and the usage of wildcards in Scala.
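The Scala counterpart of Java's wildcard is an existential type, usually written with an underscore in type position; a minimal sketch:

// Java's Collection<?> corresponds to Scala's java.util.Collection[_]
def printAll(xs: java.util.Collection[_]): Unit = {
  val it = xs.iterator()
  while (it.hasNext) println(it.next())
}

// Works for any element type:
// printAll(java.util.Arrays.asList("a", "b"))
// printAll(java.util.Arrays.asList(1, 2, 3))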

Categories : Java

Scala Collection in JSF
This code is based on ScalaElResolver by Werner Punz. I have stripped it down so it just handles the conversion from a Scala Iterable to a java.lang.Iterable:

class SimpleScalaElResolver extends ELResolver {
  override def getValue(elContext: ELContext, base: AnyRef, prop: AnyRef): AnyRef = {
    println(s"SimpleElResolver: getValue: Entering: $base.$prop")
    if (base == null) {
      null
    } else {
      val method = base.getClass.getDeclaredMethod(prop.toString)
      if (method != null) {
        val res = method.invoke(base)
        if (res.isInstanceOf[Iterable[_]]) {
          val iter = res.asInstanceOf[Iterable[_]]
          println("getValue: Wrapping as Java iterable")
          elContext.setPropertyResolved(true)
          JavaConversions.asJavaI

Categories : Scala

Scala: exception handling in anonymous function
Perhaps this will give you some ideas:

try {
  val someMap = someData.map { line =>
    try {
      (line.split("\\|")(0),            // key
       line.split("\\|")(1) + "|" +     // value as string concat
       line.split("\\|")(4) + "|" +
       line.split("\\|")(9))
    } catch {
      case inner: ArrayIndexOutOfBoundsException =>
        println("exception in " + line)
        throw inner
    }
  }
} catch {
  case outer: ArrayIndexOutOfBoundsException => ...
}
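An alternative that avoids using exceptions for control flow is to split each line once and check the number of fields. A minimal sketch (someData and the field indices come from the answer above; the rest is an assumption):

val pairs = someData.flatMap { line =>
  val fields = line.split("\\|")
  if (fields.length > 9)
    Some(fields(0) -> (fields(1) + "|" + fields(4) + "|" + fields(9)))
  else {
    println("malformed line: " + line)
    None  // drop lines that don't have enough fields
  }
}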

Categories : Debugging

Scala: Linux like pipe with error handling using Either
I think you can rewrite the second implicit like this:

implicit class fromEither[A](val in: Either[String, A]) {
  def |>[B](f: A => B) = in.right.map(f)
  def |>>[B](f: A => Either[String, B]) = in.right.flatMap(f)
}

You can also make it extend AnyVal, which should be slightly more performant. I would not even define the first implicit class: it is not much trouble to wrap the first element in Right, just like you type cat somefile at the start of a Unix pipeline. Note that there is actually a Unix-pipe-like API for running processes: http://www.scala-lang.org/api/current/#scala.sys.process.package. If you want to take it to the next level, you can look at iteratees and enumeratees, see http://www.playframework.com/documentation/2.1.x/Enumeratees. It allows you to do something like this strings
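For illustration, here is how the implicit class above chains computations on an Either (a usage sketch with made-up functions):

def parse(s: String): Either[String, Int] =
  try Right(s.trim.toInt)
  catch { case _: NumberFormatException => Left(s"not a number: $s") }

val ok  = (Right("  42 "): Either[String, String]) |>> parse |> (_ * 2)  // Right(84)
val bad = (Right("oops"):  Either[String, String]) |>> parse |> (_ * 2)  // Left("not a number: oops")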

Categories : Scala

Scala future sequence and timeout handling
There are a few things in your code that you might want to reconsider. For starters, I'm not a huge fan of submitting tasks into the ExecutionContext whose sole purpose is to simulate a timeout by calling Thread.sleep. The sleep call is blocking, and you probably want to avoid having a task in the execution context that blocks purely for the sake of waiting a fixed amount of time. I'm going to steal from my answer here and suggest that for pure timeout handling you should use something like I outlined in that answer. The HashedWheelTimer is a highly efficient timer implementation that is much better suited to timeout handling than a task that just sleeps. Now, if you go that route, the next change I would suggest concerns handling the individual timeou
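The answer recommends a HashedWheelTimer (from Netty); purely as an illustration of the idea of non-blocking timeouts, here is a sketch using only the standard library plus java.util.Timer, racing the real future against a timer-driven Promise (all names here are made up):

import java.util.{Timer, TimerTask}
import java.util.concurrent.TimeoutException
import scala.concurrent.{ExecutionContext, Future, Promise}
import ExecutionContext.Implicits.global

val timer = new Timer(true) // one daemon timer thread shared by all timeouts

def timeoutAfter[T](delayMs: Long): Future[T] = {
  val p = Promise[T]()
  timer.schedule(new TimerTask {
    def run(): Unit = p.tryFailure(new TimeoutException(s"timed out after $delayMs ms"))
  }, delayMs)
  p.future
}

// Completes with the task's result, or fails with TimeoutException -- no thread ever sleeps.
def withTimeout[T](task: Future[T], delayMs: Long): Future[T] =
  Future.firstCompletedOf(Seq(task, timeoutAfter[T](delayMs)))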

Categories : Scala

Handling hierarchical collection with Linq
It would be easier to use this Node class.

Node<MenuElement> rootNode = <Any node of the collection>.Root;
var mandatoryNodes = rootNode.SelfAndDescendants.Where(n => n.Value.IsMandatory);
foreach (var mandatoryNode in mandatoryNodes)
{
    foreach (var node in mandatoryNode.SelfAndAncestors.Reverse()) // Reverse because you want from root to node
    {
        var spaces = string.Empty.PadLeft(node.Level); // Replaces: for(int i=0; i<node.HierarchicalLevel; i++) space = String.Concat(space, " ");
        Console.WriteLine("{0}{1}. {2}", spaces, node.Value.Position, node.Value.Label);
    }
}

Categories : C#

Handling items in the collection pattern by a data mapper
I wouldn't think to actually use a factory object to add the articles. You may see yourself using one to make the instance of Article (in the second example), though. What I went ahead and did was add an addArticles() method to the ArticleCollection instance. This way you can simply call the method on your instance of ArticleCollection from the mapper. ArticleCollectionMapper may look something like:

class ArticleCollectionMapper extends DataMapperAbstract
{
    public function fetch(ArticleCollection $articles)
    {
        $prepare = $this->connection->prepare("SELECT ...");
        $prepare->execute(); // filter conditions
        $articles->addArticles($prepare->fetchAll());
    }
}

You'd need to do some filtering by getting the conditions from the A

Categories : PHP

scala, filter a collection based on several conditions
You could easily use an implicit class to give you this syntax:

val strs = List("hello", "andorra", "trab", "world")

def f1(s: String) = !s.startsWith("a")
def f2(s: String) = !s.endsWith("b")

val cond1 = true
val cond2 = true

implicit class FilterHelper[A](l: List[A]) {
  def ifFilter(cond: Boolean, f: A => Boolean) = {
    if (cond) l.filter(f) else l
  }
}

strs.ifFilter(cond1, f1).ifFilter(cond2, f2)
res1: List[String] = List(hello, world)

I would have used if as the method name but it's a reserved word.
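The same conditional chaining can be expressed without an implicit class by folding over (condition, predicate) pairs; a small sketch reusing the definitions above:

val filters: List[(Boolean, String => Boolean)] = List((cond1, f1), (cond2, f2))

val filtered = filters.foldLeft(strs) { case (acc, (cond, f)) =>
  if (cond) acc.filter(f) else acc
}
// filtered: List[String] = List(hello, world)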

Categories : Scala

populate HashMap programmatically from collection in Scala
I don't know exactly what you want to do, but is it something like this?

scala> val mergeMap = Map(("key1", "value1"), ("key2", "value2"), ("key3", "value1"))
mergeMap: scala.collection.immutable.Map[java.lang.String,java.lang.String] = Map(key1 -> value1, key2 -> value2, key3 -> value1)

scala> mergeMap.values.toSet.map((_ : String, 1)).toMap
res12: scala.collection.immutable.Map[String,Int] = Map(value1 -> 1, value2 -> 1)

The first one makes a map from String to String (not from String to a collection of Strings like in your example). The second one takes all the values from the map and builds a new map with the values from the first map as keys and the default value 1 as each value. Duplicate keys are discarded (in the "toSet" step and would be

Categories : Scala

Reading a collection of lists from file in scala
The simplest way is to use scala.io.Source to read the file line by line. With getLines, you get an Iterator[String] over which you can map to split the lines and convert them to Ints:

import scala.io.Source

val intPairs = Source.fromFile("/path/to/file").getLines.map { line =>
  line.split(" ").take(2).map(_.toInt)
}

I leave the grouping of consecutive lines as an exercise for you.
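If the grouping you need is simply fixed-size blocks of consecutive lines (an assumption; the original grouping rule isn't shown here), Iterator#grouped does it:

val groups = intPairs.grouped(3).toList  // blocks of (up to) 3 parsed lines each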

Categories : Scala

How to write a Service capable of handling multiple parameter types in Scala?
You could use

sealed trait TransferObject[T] { ... }
case class TransferObjectA(data: Int) extends TransferObject[Int]
...

and inside of def performUseCase[T](transferObjects: Iterable[TransferObject[T]]) match on each element:

someTransferObject match {
  case TransferObjectA(myInt) => ...
  ...
} // the compiler warns at compile time about an unmatched TransferObjectB, because the trait is sealed

Also have a look at "What is a sealed trait?"
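Put together, a minimal self-contained sketch of this approach might look like the following (TransferObjectB's payload type is an assumption):

sealed trait TransferObject[T] { def data: T }
case class TransferObjectA(data: Int)    extends TransferObject[Int]
case class TransferObjectB(data: String) extends TransferObject[String]

def performUseCase[T](transferObjects: Iterable[TransferObject[T]]): Unit =
  transferObjects.foreach {
    case TransferObjectA(myInt)    => println(s"handling an Int payload: $myInt")
    case TransferObjectB(myString) => println(s"handling a String payload: $myString")
  }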

Categories : Scala

handling the creation of an object, which has a child (~ a collection of entities, probably 1000000 or more)
The Spring Batch 'chunking' concept will support the retry and failure scenario you've described. That is, you've created 500 records, a failure occurs, and you don't want to lose what you've already got when you restart. A simple configuration for such a job might be as follows:

<batch:job id="entityCreationJob">
  <batch:step id="entityCreationJob.step1">
    <batch:tasklet>
      <batch:chunk reader="entityReader" writer="entityWriter" commit-interval="250"/>
    </batch:tasklet>
  </batch:step>
</batch:job>

This simple configuration will do the following:
- read/create a single record 'per row' (entity.getEntityManager(session).createEntity(e))
- 'commit' the records in 250-record blocks (set by the commit interval)

Categories : Java

Count occurrences of each item in a Scala parallel collection
If you want to make use of parallel collections and Scala standard tools, you could do it like this: group your collection by the identity and then map it to (Value, Count):

scala> val longList = List(1, 5, 2, 3, 7, 4, 2, 3, 7, 3, 2, 1, 7)
longList: List[Int] = List(1, 5, 2, 3, 7, 4, 2, 3, 7, 3, 2, 1, 7)

scala> longList.par.groupBy(x => x)
res0: scala.collection.parallel.immutable.ParMap[Int,scala.collection.parallel.immutable.ParSeq[Int]] = ParMap(5 -> ParVector(5), 1 -> ParVector(1, 1), 2 -> ParVector(2, 2, 2), 7 -> ParVector(7, 7, 7), 3 -> ParVector(3, 3, 3), 4 -> ParVector(4))

scala> longList.pa
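The transcript is cut off above; the remaining step it describes (mapping each group to its size) would presumably look something like this:

val counts = longList.par.groupBy(identity).map { case (value, occurrences) => (value, occurrences.size) }
// e.g. ParMap(5 -> 1, 1 -> 2, 2 -> 3, 7 -> 3, 3 -> 3, 4 -> 1)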

Categories : Scala

Scala: lazy evaluation on a Collection (Strategy Pattern)
I guess you mean Seq, not Map. You could use a Seq instead of the if sequence here, though I don't think it's very readable. You should use getOrElse after find. Assuming each lengthyOperationN is a function:

seq.find(_._1).map(_._2).getOrElse(lastLengthyOperation).apply()
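Spelled out, the Seq in question pairs each condition with a deferred operation; a sketch with made-up conditions and operations:

def lengthyOperation1() = "result 1"
def lengthyOperation2() = "result 2"
def lastLengthyOperation() = "fallback"

val cond1 = false
val cond2 = true

val seq: Seq[(Boolean, () => String)] = Seq(
  (cond1, () => lengthyOperation1()),
  (cond2, () => lengthyOperation2())
)

val result = seq.find(_._1).map(_._2).getOrElse(() => lastLengthyOperation()).apply()
// result: String = "result 2"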

Categories : Scala

`::` head and tail deconstruction of a java collection in scala
Using your JavaCollection extractor, here is how you can sum the first two elements without knowing the actual length of the collection (this assumes scala.collection.JavaConverters._ is in scope for asJava/asScala):

scala> val m = Map("a" -> Seq(1,2,3,4,5).asJava, "b" -> Seq(1,2).asJava).asJava
m: java.util.Map[java.lang.String,java.util.List[Int]] = {a=[1, 2, 3, 4, 5], b=[1, 2]}

scala> m.asScala.collect { case (k, JavaCollection(a, b, rest @ _*)) => k -> (a + b) }
res3: scala.collection.mutable.Map[java.lang.String,Int] = Map(a -> 3, b -> 3)
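The JavaCollection extractor itself comes from the question and is not shown here; one way such an extractor could be written is (a sketch, not necessarily the original definition):

import scala.collection.JavaConverters._

object JavaCollection {
  def unapplySeq[A](c: java.util.Collection[A]): Option[Seq[A]] =
    Option(c).map(_.asScala.toSeq)  // expose the Java collection as a Seq for vararg patterns
}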

Categories : Java

How do I parameterize filtering a collection of Scala objects by type?
In val f = Y, Y is not a type but the companion object. You could use a type alias like this:

type T = Y
l.collect { case e: T => e } // returns List[Y] = List(Y(1), Y(2))

Or you could use the companion object, but only for a fixed parameter count:

val t = Y
l.collect { case e @ t(_) => e } // returns List[Y] = List(Y(1), Y(2))

In this case you should use e @ t(_, _) for case class Y(y1: Int, y2: Int), e @ t(_, _, _) for case class Y(y1: Int, y2: Int, y3: Int), and so on.
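A more general alternative (not part of the answer above) is to pass the type as a parameter and let a ClassTag drive the runtime check; a sketch:

import scala.reflect.ClassTag

def filterByType[T: ClassTag](xs: List[Any]): List[T] =
  xs.collect { case t: T => t }  // the ClassTag makes the type test checked at runtime

// e.g. filterByType[Y](l) keeps only the Y instances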

Categories : Scala

Scala wrong implicit ambiguity
The real problem is that your intToView method fundamentally doesn't make sense and can't be usefully implemented. Right now it effectively says "give me an integer and tell me a subtype of View, and I'll give you an instance of that subtype". But this isn't possible, because I can invent some pretty messed-up classes that extend View. For example:

class NamedView(val name: String) extends View

The type signature of your intToView method promises that the following should work:

val viewName: String = intToView[NamedView](13).name

What could this possibly be? How could you implement intToView in such a way that it would know how to create a NamedView, which I've just made up on the spot? So ??? has allowed you to write a method that compiles but doesn't make sense. This is a dangero

Categories : Scala

Casting java.util.LinkedHashMap to scala.collection.mutable.Map
Importing the JavaConversions stuff doesn't make Java's collection types instances of the Scala collection types; it provides handy conversion methods between the two distinct collection hierarchies. In this case, given the import in your question, you can get a mutable Scala Map from your java.util.LinkedHashMap with the line:

val s = mapAsScalaMap(m)
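Equivalently, with the decorator-style converters (assuming m is, say, a java.util.LinkedHashMap[String, String]):

import scala.collection.JavaConverters._

val s: scala.collection.mutable.Map[String, String] = m.asScala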

Categories : Scala

How to make tree implemented in Scala useful with higher-order collection functions?
Let's rename FactType to something that looks more like a type parameter. I think naming it just T helps indicate it is a type parameter versus a meaningful class in your code:

sealed abstract class FactsQueryAst[T] extends Traversable[T]

So FactsQueryAst contains things of type T and we want to be able to traverse the tree to do something for each t: T. The method to implement is:

def foreach[U](f: T => U): Unit

So, replacing all FactType in your code with T and modifying the signature, I end up with:

object FactsQueryAst {
  case class AndNode[T](subqueries: Seq[FactsQueryAst[T]]) extends FactsQueryAst[T] {
    def foreach[U](f: T => U) { subqueries foreach { _.foreach(f) } }
  }
  case class OrNode[T](subqueries: Seq[FactsQueryAst[T]]) extends FactsQueryAst[T] {
    def f
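The code above is cut off; the OrNode case presumably mirrors AndNode, and a leaf case is needed to hold the actual values. A minimal self-contained sketch on pre-2.13 collections (where Traversable exists); the leaf's shape is an assumption, not taken from the original question:

sealed abstract class FactsQueryAst[T] extends Traversable[T]

object FactsQueryAst {
  case class AndNode[T](subqueries: Seq[FactsQueryAst[T]]) extends FactsQueryAst[T] {
    def foreach[U](f: T => U): Unit = subqueries.foreach(_.foreach(f))
  }
  case class OrNode[T](subqueries: Seq[FactsQueryAst[T]]) extends FactsQueryAst[T] {
    def foreach[U](f: T => U): Unit = subqueries.foreach(_.foreach(f))
  }
  case class FactNode[T](fact: T) extends FactsQueryAst[T] {
    def foreach[U](f: T => U): Unit = f(fact)
  }
}

// With foreach in place, the usual higher-order collection methods come for free, e.g.:
// FactsQueryAst.AndNode(Seq(FactsQueryAst.FactNode(1), FactsQueryAst.FactNode(2))).map(_ * 10)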

Categories : Scala

How to give the Scala compiler evidence that a collection has elements of the correct types?
You can enable the conversions with a simple set of implicits:

class JValue

implicit def intToJValue(x: Int): JValue = new JValue
implicit def stringToJValue(x: String): JValue = new JValue

val xs: List[JValue] = List(1, "hello")

For your second question, you can enable wholesale list conversion with:

implicit def listToJList[A <% JValue](xs: List[A]): List[JValue] = xs.map(x => x: JValue)

def foo[A <% JValue](x: List[A]): List[JValue] = x

The above only works if you have a uniform element type; otherwise you will need to employ more sophisticated means, since a list of heterogeneous types will unify to List[Any] in most cases. You could come up with more elegant/complicated solutions using shapeless, most employing shapeless.Poly and HLists.

Categories : Scala

scala - serialize Int to ArrayBuffer[Byte]. Bit twiddle goes wrong
Scala's toBinaryString method defers to the Java one on Integer. From its documentation:

public static String toBinaryString(int i)

Returns a string representation of the integer argument as an unsigned integer in base 2. The unsigned integer value is the argument plus 2^32 if the argument is negative; otherwise it is equal to the argument. This value is converted to a string of ASCII digits in binary (base 2) with no extra leading 0s.

In other words, it's working as specified. Your bit-twiddling seems to be OK, but when you're printing the numbers out, you need to realise that the number of characters depends on the width of the data type. (E.g. -1: Int in binary is 11111111111111111111111111111111 while -1: Byte is 11111111.) You get away with it for positive numbe
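To print a single byte's bits with a fixed width of 8, so that positive and negative values line up, something like this sketch works:

def byteBits(b: Byte): String =
  String.format("%8s", Integer.toBinaryString(b & 0xff)).replace(' ', '0')  // mask to 0..255, pad to 8 bits

byteBits(-1) // "11111111"
byteBits(5)  // "00000101"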

Categories : Scala

What's wrong with this piece of code? Scala - tuple as function type
To do a pattern match (your case statements here), you need to tell the compiler what to match on:

def find_host ( f : (String, Int) ) = f match { ...
                                      ^^^^^^^
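For comparison, a complete version of the idea (the body is made up; only the `f match` part comes from the answer):

def find_host(f: (String, Int)): String = f match {
  case (host, port) => s"$host:$port"
}

// find_host(("localhost", 8080))  // "localhost:8080"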

Categories : Scala

xcode view collection cell wrong size
The issue is that the collection view flow layout is using the default size for its itemSize property. Fortunately, this is easy to fix: simply add the following code to your collection view's delegate:

-(CGSize)collectionView:(UICollectionView *)collectionView layout:(UICollectionViewLayout *)collectionViewLayout sizeForItemAtIndexPath:(NSIndexPath *)indexPath {
    return CGSizeMake(<your width>, <your height>);
}

Categories : Misc

Angular watch listener on filtered collection called at the wrong time
I'm not quite sure why it's doing that, but it seems Angular is deferring its digest cycle. Unless you want to debug Angular itself to find the cause, I'd suggest two alternatives. In most cases, manual watches are used when you do not want to change something on the scope itself (i.e. watch for a change and call another function, without modifying the scope). What you want to achieve is closely related to the concept of a computed observable, but at the same time you only return a property from another object without transforming it. Use the property directly: in the template, you can directly use filteredItems.length. Because it's a number, Angular will achieve stability in the digest cycle even though the underlying array changes. <p>filtered items length: {{filteredItem

Categories : Javascript

java garbage collection does a major collection (mark and sweep) every time it does a
I would try removing all the settings you have, so that you have the absolute minimum and are less likely to get a strange interaction between settings. Try just

-Xmx16g -XX:+UseConcMarkSweepGC

and monitor it using jstat.

Categories : Java

Convert from scala.collection.Seq to java.util.List in Java code
You're on the right track using JavaConversions, but the method you need for this particular conversion is seqAsJavaList:

java.util.List<String> convert(scala.collection.Seq<String> seq) {
    return scala.collection.JavaConversions.seqAsJavaList(seq);
}

Categories : Java

Create a collection of sub collection from a given Collection (Group By) in Java
Probably this is what you need until Java 8 closures arrive:

HashMap<Integer, DateTileGrid> dataGridMap = new HashMap<>();
for (Date date : dateList) {
    int year = date.getYear(); // Deprecated.. use something better
    DateTileGrid dataGrid = dataGridMap.get(year);
    if (dataGrid == null) {
        dataGrid = new DateTileGrid();
        dataGrid.setCurrentYear(year);
        dataGrid.setDateTiles(new ArrayList<Date>());
        dataGridMap.put(year, dataGrid);
    }
    dataGrid.getDateTiles().add(date);
}

// Here is your result
ArrayList<DateTileGrid> result = new ArrayList<>(dataGridMap.values());

Categories : Java

android error handling, assertions or exception handling
An assertion is for something that REALLY NEVER should happen. Assertions are not for checking ordinary conditions, but for checking strong assumptions that you are sure can never be violated. You enable assertions for debugging, but they should not be relied on in production code. So using an assertion for a null check is a bad idea, because in Java null values are all around; better use if (object == null).

Categories : Java

Scala: How to transform a POJO like object into a SQL insert statement using Scala reflection
In your case, you can use the following code:

val o = new MyDataObj
val attributes = o.getClass.getDeclaredMethods.filter {
  _.getReturnType != Void.TYPE
}.map { method =>
  (method.getName, method.getReturnType, method.invoke(o))
}

Here I use getDeclaredMethods to get the public methods in MyDataObj. Note that getDeclaredMethods cannot get methods from a parent class. For MyDataObj, getDeclaredMethods will return the following methods:

public double MyDataObj.c()
public boolean MyDataObj.b()
public java.lang.String MyDataObj.d()
public int MyDataObj.a()
public void MyDataObj.c_$eq(double)
public void MyDataObj.d_$eq(java.lang.String)
public void MyDataObj.b_$eq(boolean)
public void MyDataObj.a_$eq(int)

So I add a filter to filter out the irrelevant methods.
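The question asks for an SQL insert statement; continuing the idea above, the extracted (name, type, value) triples could be turned into one like this (the table name and the naive quoting are assumptions; real code should use prepared-statement placeholders instead):

val columns = attributes.map(_._1).mkString(", ")
val values  = attributes.map { case (_, _, value) => s"'$value'" }.mkString(", ")
val insert  = s"INSERT INTO my_table ($columns) VALUES ($values)"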

Categories : SQL

"scala.runtime in compiler mirror not found" but working when started with -Xbootclasspath/p:scala-library.jar
The easy way to configure the settings with familiar keystrokes:

import scala.tools.nsc.Global
import scala.tools.nsc.Settings

def main(args: Array[String]) {
  val s = new Settings
  s processArgumentString "-usejavacp"
  val g = new Global(s)
  val r = new g.Run
}

That works for your scenario. Even easier:

java -Dscala.usejavacp=true -jar ./scall.jar

Bonus info, I happened to come across the enabling commit message:

Went ahead and implemented classpaths as described in email to scala-internals on the theory that at this point I must know what I'm doing.

** PUBLIC SERVICE ANNOUNCEMENT **

If your code of whatever kind stopped working with this commit (most likely the error is something like "object scala not found") you can get it working again

Categories : Scala

"error: can't find main class scala.tools.nsc.MainGenericRunner" when running scala in windows
Those weird variables are called parameter extensions. They allow you to interpret a variable as a path to a file/directory and directly resolve things from that path. For example, if %1 is a path to a file dir123456\file.txt, %~f1 is the fully qualified path to file.txt, %~p1 is the path to the containing directory dir123456, %~s1 is the path in short name format dir123~1\file.txt, and many others... Also, %0 is always set to the path of the currently running script. So: %~fs0 is the fully qualified path, in short name format, to the current script, and %%~dpsi is the manual expansion of the FOR variable %%i to a drive letter (d option) followed by the path of the containing folder (p option), in short format (s option). Now, this weird-looking block of code is a workaround for KB83343

Categories : Windows

scala Duration: "This class is not meant as a general purpose representation of time, it is optimized for the needs of scala.concurrent."
Time can be represented in various ways depending on your needs. I personally have used:

- Long — a lot of tools take it directly
- Updated: java.time.* (thanks to @Vladimir Matveev). The package is designed by the author of Joda Time (Stephen Colebourne), who says it is designed better.
- Joda Time
- java.util.Date
- A separate hierarchy of classes:

trait Time
case class ExactTime(timeMs: Long) extends Time
case object Now extends Time
case object ASAP extends Time
case class RelativeTime(origin: Time, deltaMs: Long) extends Time

- An ordered time representation: case class History[T](events: List[T])
- Model time. Once I had a global object Timer with var currentTime: Long:

object Timer {
  private var currentTimeValue: Long
  def currentTimeMs = currentTimeValue
  def currentTimeMs_=(newTime: Long) { ...

Categories : Scala

Why use template engine in playframework 2 (scala) if we may stay with pure scala
Actually you should ask this question to the dev team; however, consider a few points: You don't need to use Play's templating engine at all — you can easily return any string with the Ok() method, so according to your link you can just do something like Ok(theDate("John Doe").toString()). Play uses an approach that is very typical of other MVC web frameworks, where views are HTML-based files, because... it's a web-dedicated framework. I can't see anything wrong with this; sometimes I work with other languages/frameworks and the only difference in their views is language-specific syntax — that's the goal! Don't forget that Play is a bilingual system; someone could ask 'why not use some Java lib for processing the views?' The built-in Scala XML literals a

Categories : Scala

Why can I start the Scala compiler with "java -cp scala-library.jar;. Hello World"?
Short version: you're not using the java compiler, you're using the java runtime. Long version: there's a big difference between javac and java. javac is the java compiler, which takes in java source code and outputs jvm bytecode. java is the java runtime, which takes in jvm bytecode and runs it. But one of the great things about the jvm is that you can generate bytecode for it any which way. Scala generates jvm bytecode without any java source code.

Categories : Java

Scala Macros: Making a Map out of fields of a class in Scala
Note that this can be done much more elegantly without the toString / c.parse business:

import scala.language.experimental.macros

abstract class Model {
  def toMap[T]: Map[String, Any] = macro Macros.toMap_impl[T]
}

object Macros {
  import scala.reflect.macros.Context

  def toMap_impl[T: c.WeakTypeTag](c: Context) = {
    import c.universe._

    val mapApply = Select(reify(Map).tree, newTermName("apply"))

    val pairs = weakTypeOf[T].declarations.collect {
      case m: MethodSymbol if m.isCaseAccessor =>
        val name = c.literal(m.name.decoded)
        val value = c.Expr(Select(c.resetAllAttrs(c.prefix.tree), m.name))
        reify(name.splice -> value.splice).tree
    }

    c.Expr[Map[String, Any]](Apply(mapApply, pairs.toList))
  }
}

Note also that you need the c.r

Categories : Scala

scala for each loop got error when convert java to scala
override def saveOrUpdateAll(entities: Collection[T]) {
  import scala.collection.JavaConverters._
  val session: Session = getSession()
  for (entity <- entities.asScala) {
    session.saveOrUpdate(entity)
  }
}

There is no Java-style for-each loop in Scala. You should wrap your collection using JavaConverters and use a for comprehension here. JavaConverters wraps the Collection using Wrappers.JCollectionWrapper, without memory overhead.

Categories : Java


