Scala: How to transform a POJO like object into a SQL insert statement using Scala reflection
In your case, you can use the following code:

val o = new MyDataObj
val attributes = o.getClass.getDeclaredMethods
  .filter { _.getReturnType != Void.TYPE }
  .map { method => (method.getName, method.getReturnType, method.invoke(o)) }

Here I use getDeclaredMethods to get the public methods of MyDataObj. Note that getDeclaredMethods does not return methods declared in the parent class. For MyDataObj, getDeclaredMethods will return the following methods:

public double MyDataObj.c()
public boolean MyDataObj.b()
public java.lang.String MyDataObj.d()
public int MyDataObj.a()
public void MyDataObj.c_$eq(double)
public void MyDataObj.d_$eq(java.lang.String)
public void MyDataObj.b_$eq(boolean)
public void MyDataObj.a_$eq(int)

This list includes the generated setters (the *_$eq methods), so I add a filter to drop the irrelevant ones.
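Since the question asks for an SQL INSERT statement, here is a minimal sketch (my addition, not part of the original answer) of how the extracted (name, type, value) triples could be turned into one. The table name myDataTable and the naive quoting are assumptions:

val tableName = "myDataTable" // hypothetical table name
val columns = attributes.map(_._1).mkString(", ")
val values = attributes.map {
  case (_, t, v) if t == classOf[String] => "'" + v.toString.replace("'", "''") + "'" // naive quoting for strings
  case (_, _, v) => v.toString
}.mkString(", ")
val insert = s"INSERT INTO $tableName ($columns) VALUES ($values)"

In real code you would use a PreparedStatement with placeholders instead of string concatenation.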

Categories : SQL

Scala reflection and Squeryl
It's not really a Squeryl thing. As I understand it, the problem is that the pattern-match testing is done at runtime, after type erasure has occurred. Scala is able to keep the type information around with the TypeTag and perform the runtime check, but it can't infer that the types are correct at compile time. If you were to try something like

case t: ClassTag[User] => users

which asks the compiler to do a static check, you would get a warning that the User type is erased. The way you are doing it should work, since it is fine to perform the cast after you've verified the type, and I don't think there is a better way.
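For reference, a runtime check against a TypeTag can look roughly like the sketch below (my addition). User and users stand in for the Squeryl entity and table from the question; lookupTable is a hypothetical name:

import scala.reflect.runtime.universe._

case class User(name: String)              // hypothetical stand-in for the real entity
val users = List(User("a"), User("b"))     // hypothetical stand-in for the table

def lookupTable[T: TypeTag]: Any = typeOf[T] match {
  case t if t =:= typeOf[User] => users    // checked against the full Type at runtime, not the erased class
  case other => sys.error(s"no table for $other")
}

lookupTable[User]                          // returns users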

Categories : Scala

Abstract reflection API in Scala 2.10
You can just accept the universe as a parameter:

class MyReflection(val u: scala.reflect.api.Universe) {
  import u._
  def foo[T : TypeTag] = implicitly[TypeTag[T]].tpe.members
}

val x = new MyReflection(scala.reflect.runtime.universe)

Note that you'll have to refer to the universe via your instance of MyReflection to get the path-dependent types right:

val members: x.u.MemberScope = x.foo[String]

Have a look at this question for more examples and options.

Categories : Scala

Scala Reflection Search?
You can use Mirrors to reflect on this and then invoke the desired method:

import scala.reflect.runtime.universe._
import scala.reflect.runtime.currentMirror

def depthFirstDebug(level: Int) {
  val mirror = currentMirror.reflect(this)
  val tpe = mirror.symbol.typeSignature
  for {
    m <- tpe.members
    if m.typeSignature <:< typeOf[CanDebug]
    if m.isTerm
    if !m.isMethod
  } {
    val fld = mirror.reflectField(m.asTerm)
    fld.get.asInstanceOf[CanDebug].depthFirstDebug(level + 1)
  }
  println("level " + level + " " + this.toString)
}

Now you can:

val b = new B
b.depthFirstDebug(0)

which gives you:

level 1 A(World)
level 1 A(Hello)
level 0 Start

Categories : Scala

See annotations in Scala reflection
This is tricky! The annotation is not on the member of your class, but actually on the parameter in the apply method of your companion object! From your type, you should be able to get the companion object with:

val companion = myType.typeSymbol.companionSymbol

From there you can use reflection to look at the parameters to the apply method.
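A minimal sketch of that lookup (my addition), assuming Scala 2.10-style runtime reflection, a case class with a single apply overload, and a myType value as in the question:

import scala.reflect.runtime.universe._

def annotatedParams(myType: Type): List[(Symbol, List[Annotation])] = {
  val companion = myType.typeSymbol.companionSymbol
  // assumes apply is not overloaded; otherwise pick the right alternative from the overloaded symbol
  val apply = companion.typeSignature.member(newTermName("apply")).asMethod
  apply.paramss.flatten
    .map(p => p -> p.annotations)
    .filter(_._2.nonEmpty)
}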

Categories : Scala

Scala Reflection of Nested List
This snippet shows matching to pull apart the type; the two calls to showType show what you're doing and what you intend.

val cType = sym.typeSignature

def showType(t: Type): Unit = {
  if (t.typeSymbol.fullName.toString == "scala.collection.immutable.List") {
    t match {
      case PolyType(typeParams, resultType) =>
        println(s"poly $typeParams, $resultType")
      case TypeRef(pre, sym, args) =>
        println(s"typeref $pre, $sym, $args")
        val subtype = args(0)
        println("Sub:" + subtype)
        showType(subtype.typeSymbol.typeSignature)
        showType(subtype)
    }
  }
}

showType(cType)

Categories : List

How to use Java-Reflection to set the value of type scala.Enumeration
What is missing from the code is a way to convert the value from a String to a DayType, much like the way the Integer is handled with Integer.parseInt(value). Try using the Value construct that is used in the DayType Enumeration.
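A minimal sketch of that conversion (my addition), assuming DayType is a standard scala.Enumeration and the value is written through java.lang.reflect.Field; the Shift class and field names here are hypothetical stand-ins:

// hypothetical enumeration, playing the role of the question's DayType
object DayType extends Enumeration {
  val Weekday, Weekend = Value
}

// hypothetical target class with a DayType field
class Shift {
  var day: DayType.Value = _
}

val target = new Shift
val value = "Weekend"                 // the String coming from reflection-driven input
val parsed = DayType.withName(value)  // withName plays the role Integer.parseInt plays for ints
val field = target.getClass.getDeclaredField("day")
field.setAccessible(true)
field.set(target, parsed)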

Categories : Java

How can I use Scala reflection to find the self type traits?
Welcome to Scala version 2.10.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_10).
Type in expressions to have them evaluated.
Type :help for more information.

scala> import scala.reflect.runtime.universe._
import scala.reflect.runtime.universe._

scala> :paste
// Entering paste mode (ctrl-D to finish)

trait Bar
trait Dar
trait Foo { self: Bar with Dar => }

// Exiting paste mode, now interpreting.

defined trait Bar
defined trait Dar
defined trait Foo

scala> val selfTypeOfFoo = typeOf[Foo].typeSymbol.asClass.selfType
selfTypeOfFoo: reflect.runtime.universe.Type = Foo with Bar with Dar

If you want to inspect the self type further, you can match it against RefinedType:

scala> val RefinedType(parents, _) = selfTypeOfFoo
parents: List[reflect.runtime.universe.Type] = ...

Categories : Scala

Why is scala reflection reporting no declarations for certain classes?
The reason is that both members and declarations only take into account object members. However, all functions declared in java.lang.System are static. This makes sense because, from the Scala point of view, there are no static members: the equivalent of a static member is a method/value defined in a module (using object instead of class). So Scala reflection acts as if static members of a Java class were defined in a module, more specifically in the companion object of the Java class. (Note that, in contrast to Scala-defined companion objects, these "Java companion objects" do not exist at the VM level.) I'm no expert in Scala reflection, so I can't tell you how you would find the static members :-(
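As a hedged follow-up (my addition, not from the original answer): in my experience the static members do show up on the companion symbol of the Java class, so a sketch along these lines is worth trying:

import scala.reflect.runtime.universe._

// assumption: statics of a Java class are exposed via the "Java companion object" described above
val statics = typeOf[java.lang.System].typeSymbol
  .companionSymbol
  .typeSignature
  .declarations
statics.foreach(println)   // expected to list currentTimeMillis, getProperty, etc.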

Categories : Scala

Using Scala reflection to find most derived runtime type
The most obvious solution would be to use the class:

def checkType[A](item: A) {
  println("typeOf[A]: " + item.getClass)
}

But if you want to work with Type, then some additional work is needed:

def checkType[A](item: A) {
  val mirror = runtimeMirror(this.getClass.getClassLoader)
  println("typeOf[A]: " + mirror.classSymbol(item.getClass).toType)
}
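For illustration (my addition), calling the second version with a value whose static type is wider than its runtime class shows why this reports the most derived runtime type:

val x: Any = "hello"   // statically typed as Any
checkType(x)           // reports the runtime type of the String instance, not Any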

Categories : Scala

Use Scala's reflection API for dynamic factory class instantiation
It's possible by telling the compiler to retain runtime type information using an implicit ClassTag:

import scala.reflect.{ClassTag, classTag}

object Factory {
  def apply[T <: A]()(implicit tag: ClassTag[T]) = {
    val A1 = classOf[A1]
    val A2 = classOf[A2]
    classTag[T].runtimeClass match {
      case A1 => new A1Factory()
      case A2 => new A2Factory()
    }
  }
}
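Usage then looks like the line below (my addition; A1 and A1Factory are the types from the question, and the result type of the match is the least upper bound of the two factory classes):

val factory = Factory[A1]()   // resolves to new A1Factory()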

Categories : Scala

instantiating a Scala class using reflection Java's `newInstance`
Obviously, you need to make the default constructor public (it won't work for Java classes without a public default constructor either). E.g.

class ScalaClassYYY() { ... }

or, if you want the primary constructor to take some arguments,

class ScalaClassYYY(arg1: Int) {
  def this() = this(0)
}

But from

Note the compiled Scala class name ending with $

it seems like you are actually trying to instantiate a Scala object:

object ScalaClassYYY { ... }

In this case, you shouldn't create a new instance and instead use the existing one:

(IRegisterExecutor) registerClass.getField("MODULE$").get(null);

EDIT: I don't see in your answer how you add a default public constructor to a Scala class that does NOT require any parameters. A class (not an object) that doesn't require any parameters...

Categories : Java

Deep access of fields in Scala using runtime reflection
You're exactly right, the type of your dos is Any. FieldMirror.symbol.typeSignature is what you'd get from typeOf[Dos]. So consider returning a pair (Any, Type) from unpack, to have something to pass to unpack(target, type, name). Somewhat like:

case class Uno(name: String, age: Int, pets: List[String], stuff: Dos)
case class Dos(foo: String)

object Boom extends App {
  import scala.reflect.runtime.universe._
  import scala.reflect.runtime.{ currentMirror => cm }
  import scala.reflect.ClassTag

  val u = Uno("Marcus", 19, List("fish", "bird"), Dos("wow"))
  println("NAME: " + unpack(u, "name")) // Works
  println("PETS: " + unpack(u, "pets")) // Works

  // ----- Goes Boom -------
  val (dos, dosT) = unpack(u, "stuff")
  println("Other: " + unpack(dos, dosT, "foo")) // Boom! ...

Categories : Scala

Vlc and vlc web plugin on a cubox with arm architecture
http://www.solid-run.com/mw/index.php/CuBox_hardware_specification

Their hardware specs list ARMv6/ARMv7-compatible cores, so if VLC is compiled for those processors then yes, it "should" play. There is only one way to find out: get hold of their distribution of the OS (which is just a variant of popular Linux distros) and install it in a VM that simulates the hardware, or get an actual CuBox.

http://www.solid-run.com/mw/index.php/Category:Operating_System

List/comparison of the Linux distros used.

Categories : Linux

Scala Reflection - Loading or finding classes based on trait
This is what ServiceLoader is for. I think the reflection API does make it easier to sort out what you need (i.e., for filtering, but not for querying the class loader). If, by your phrase "searching the loaded classes", you really mean classes that are already loaded, see this question for getting them. You could imagine a widgets library with an initializer that just ensures that all the widget classes it knows about are loaded; then the client only needs to know the initializer. The type test is the same:

val need = typeOf[Whatsit[Cog]]
for (x <- (ServiceLoader load classOf[Whatsit[_]]).asScala) {
  val im = currentMirror reflect x
  if (im.symbol.toType weak_<:< need)
    Console println s"$x is what I need"
  else
    Console println s"$x is not what I need, I'm lookin..."

Categories : Scala

How to list all fields with a custom annotation using Scala's reflection at runtime?
This can be done with a TypeTag, by filtering through the members of your input type:

import reflect.runtime.universe._

def listProperties[T: TypeTag]: List[(TermSymbol, Annotation)] = {
  // a field is a Term that is a var or a val
  val fields = typeOf[T].members.collect { case s: TermSymbol => s }
    .filter(s => s.isVal || s.isVar)

  // then only keep the ones with a MyProperty annotation
  fields.flatMap(f =>
    f.annotations.find(_.tpe =:= typeOf[MyProperty]).map((f, _))
  ).toList
}

Then:

scala> class A { @MyProperty("") val a = 1 ; @MyProperty("a") var b = 2 ; var c: Long = 1L }
defined class A

scala> listProperties[A]
res15: List[(reflect.runtime.universe.TermSymbol, reflect.runtime.universe.Annotation)] = List((variable b,MyProperty("a")), (value a,MyPro...

Categories : Scala

Plugin architecture in Spring web application
Maybe you should try to use Spring OSGi with Gemini? http://www.eclipse.org/gemini/blueprint/documentation/reference/1.0.2.RELEASE/html/app-deploy.html

Categories : Spring

C# Windows Service using reflection for plugin assemblies
Option 1. You can mark your plugin assembly with a special attribute and then check it before enumerating all types. To put an attribute on an assembly, in any code file of that assembly you can write:

[assembly: YourSpecialAttributeClass]

Option 2. You can also try separating assemblies into different folders, and hooking into the AppDomain.AssemblyResolve event, manually searching for the needed assembly in the plugins and dependencies folders.

Categories : C#

Unit testing plugins in an application with plugin architecture
Seems like putting generic tests into a base class and deriving from it for each plugin is a viable way:

public interface IPluginComponent { }

[TestClass]
public abstract class BaseTests
{
    protected abstract IPluginComponent CreateComponent();

    [TestMethod]
    public void SomeTest()
    {
        IPluginComponent component = this.CreateComponent();
        // execute test
    }
}

public class MyPluginComponent : IPluginComponent { }

[TestClass]
public class MyPluginTests : BaseTests
{
    protected override IPluginComponent CreateComponent()
    {
        return new MyPluginComponent();
    }

    [TestMethod]
    public void CustomTest()
    {
        // custom test
    }
}

Tests using MSTest should however be aware of a bug where it is not possible to run tests in the base class if...

Categories : C#

Can we use Google data plugin in scala?
The problem that's causing your error is that the object Contacts does not have a main method; instead, it contains an inner class called Test which has a main method. I don't believe that is what you want (in Scala, object methods are the equivalent of Java static methods), so the main method should be moved out into Contacts and the inner class deleted. Also, for (i <- 0 to entries.size()) is probably a mistake: it is roughly equivalent to for (int i = 0; i <= entries.size(); i++) (notice the <=). You probably want for (i <- 0 until entries.size()). While you're there, you can kill the try..catch blocks if you like, as Scala doesn't use checked exceptions. If you import scala.collection.JavaConversions._, then you can use for (entry <- entries), which may be less error prone.
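A minimal sketch of that last suggestion (my addition; the java.util.List here is a hypothetical stand-in for the entries returned by the GData feed):

import scala.collection.JavaConversions._

val entries: java.util.List[String] = java.util.Arrays.asList("a", "b", "c") // hypothetical stand-in
for (entry <- entries) {   // iterates the Java list directly, no index arithmetic
  println(entry)
}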

Categories : Java

Netbeans scala plugin problems with ant
The NetBeans error message could be worded more clearly. One issue is that you are using MS Windows syntax (%SCALA_HOME%) for evaluating your environment variable; the Unix syntax is $SCALA_HOME or ${SCALA_HOME}. The following works for me. First add the following line to .profile:

export SCALA_HOME=/usr/local/Cellar/scala/2.10.1/libexec

and append the following to netbeans_default_options in netbeans.conf:

-J-Dscala.home=/usr/local/Cellar/scala/2.10.1/libexec

I'd prefer to just list the Scala home directory in one place to make upgrades more foolproof, but referencing ${SCALA_HOME} from netbeans.conf didn't work for me. Maybe the environment variable needs to be defined elsewhere to be visible from netbeans.conf.

Categories : Scala

Scala Compiler Plugin Deconstruction
You're probably struggling with the cake pattern that the compiler is implemented with, and the lot of path-dependency that comes with it. I went through this some time ago when I was writing a really beefy macro and wanted to refactor a bunch of functions out of the macro implementation into a separate utility class, and I found this to be quite an annoying issue. Here's how I would implement your Traverser in a separate class:

class MyPluginUtils[G <: Global with Singleton](global: G) {
  import global._

  class AnalyzingTraverser extends ForeachTreeTraverser(tree => { /* analyze */ })
}

Now, inside your plugin you have to use it like this:

val utils = new MyPluginUtils[global.type](global)
import utils.{global => _, _}

val traverser = new AnalyzingTraverser

As you can see, it's n...

Categories : Scala

Compile scala files from a sbt plugin
I suggest you have a look at sbt-boilerplate, which is an sbt plugin that generates code, works well, and is really simple. Here's a link to the file that you probably want to take a look at.

Categories : Scala

Can't debug a Scala application in IntelliJ + sbt-idea-plugin
To debug, just as to run, you need to create a run configuration (menu Run -> Edit Configurations). If you haven't done so yet, you need to add an Application entry with the + button. Not only do you need to specify the main class, but also which "module" that class belongs to. By default, "Use classpath of module" will be empty; here, in the popup menu, you need to select the main module (not the one ending in "-build"). After you choose that and close with "Ok", it should work. Although not necessary, I also recommend using sbt for building instead of "Make": in the configuration, in the "Before launch" part, select "Make" and click on "-", then click on "+" and choose sbt -> test:products. Edit: Here is the reference for the sbt plugin for IntelliJ.

Categories : Scala

Results encoding in Scala Worksheet Eclipse plugin
This happens because the JVM handles input strings as encoded with the system default (cp1251 in my case). You should tell the JVM that input strings are in UTF-8. To do this, put the following into your eclipse.ini:

-Dfile.encoding=UTF8

If there is a better (project-local) solution, please let me know. This seems to be a Windows-only issue.

Categories : Eclipse

Custom wro4j plugin for Scala's Simple Build Tool
This is not necessarily an answer to all of your questions, but an explanation of the reason the xsbt wro4j plugin (and wro4j-maven-plugin) uses Mockito. wro4j was created initially as a runtime-only solution (using an HttpServletFilter) to minimize static resources on the fly. As a result, the internal API is based on the servlet-api (more specifically the HttpServletRequest & HttpServletResponse objects). Later, when a build-time solution was required, instead of changing the internals of the framework, a suitable workaround was applied: a mechanism for stubbing the servlet-api in a non-servlet environment (build time). The way I see the long-term approach: make wro4j servlet-api agnostic and allow build-time solutions, like the maven plugin or the xsbt plugin, to not require this workaround.

Categories : Scala

Eclipse(STS) break breakpoint not hit using maven surefire plugin and scala ide
If you have flexibility in your choice of IDEs, then I'd switch to IntelliJ: http://www.jetbrains.com/idea/ I've had issues like this in the past with maven-eclipse, and while they are solvable, IntelliJ supports Maven natively.

Categories : Java

Filter Method in searchView Widget for android scala eclipse plugin
Change

val books

to

var books

and change

val filteredResults: List[BookMetadata] = ListBuffer(books.asScala.toList.filter(b.startsWith(constraint.toString)): _*)

to

val filteredResults: List[BookMetadata] = ListBuffer(books.asScala.toList.filter(b => b.startsWith(constraint.toString)): _*)

And please see: Use of def, val, and var in Scala.

Categories : Java

How do I resolve "error: bad symbolic reference" for dependencies using maven-scala plugin?
This error:

error: bad symbolic reference. A signature in Mapper.class refers to term runtime in package scala.reflect which is not available.

is saying that scala.reflect.runtime is missing from the classpath. And indeed, upon checking /tmp/scala-maven-6314934214401019063.args, it was not in the classpaths listed there. Slick for Scala 2.10 depends on the Scala reflection package (see https://github.com/slick/slick/blob/master/src/main/scala/scala/slick/direct/MetadataProvider.scala), so the POM for Slick should list scala-reflect so that other projects can resolve it as a transitive dependency. However, slick_2.10-1.0.1.pom does not list scala-reflect. Adding scala-reflect as a dependency in my own project POM fixed this.
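For reference, the fix described above amounts to something like the following in the project's pom.xml (a sketch, assuming Scala 2.10.x; match the version to the Scala version your project uses):

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-reflect</artifactId>
  <version>2.10.1</version>
</dependency>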

Categories : Scala

Maven, maven-assembly-plugin and scala-library.jar file
Yes, it's normal. If you include scala-library.jar as a jar inside your jar, then you have to include custom code to load it in your main(...), because java -jar xxx.jar ignores jars inside a jar. You have to create your own assembly descriptor file with <unpack>false</unpack> on the dependencySet (under binaries or root, I don't remember), plus change your main to load the jar from a resource (your main() should not depend on scala-library). Alternatively, you can use a tool dedicated to this job instead of assembly (it includes the code to launch the jar), like onejar or launch4j (Maven plugins for those tools should exist). But my recommendation is to use ProGuard: it will create a single jar like your current code, but it will also shrink unused code (a large part of scala-library). Try it manually (without the Maven plugin) first.

Categories : Scala

Eclipse Plugin Development: Make plugin dependent on CDT-plugin
For your first question, simply add the CDT plugins to the Required plug-ins of your project. Details here. Editors and natures are different things; you cannot set an "editor as a nature". Probably what you're searching for is the C++ nature, which you want to set for a project programmatically. Simply search for the nature's ID in the .project file of a C++ project, then you can start looking into the details.

Categories : Java

"error: can't find main class scala.tools.nsc.MainGenericRunner" when running scala in windows
Those weird variables are called parameter extensions. They allow you to interpret a variable as a path to a file/directory and directly resolve things from that path. For example, if %1 is a path to a file dir123456\file.txt, then %~f1 is the fully qualified path to file.txt, %~p1 is the path to the containing directory dir123456, %~s1 is the path in short-name format dir123~1\file.txt, and so on. Also, %0 is always set to the path of the currently running script. So: %~fs0 is the fully qualified path, in short-name format, to the current script; %%~dpsi is the manual expansion of the FOR variable %%i to a drive letter (d option) followed by the path of the containing folder (p option), in short format (s option). Now, this weird-looking block of code is a workaround for KB83343...

Categories : Windows

"scala.runtime in compiler mirror not found" but working when started with -Xbootclasspath/p:scala-library.jar
The easy way to configure the settings with familiar keystrokes:

import scala.tools.nsc.Global
import scala.tools.nsc.Settings

def main(args: Array[String]) {
  val s = new Settings
  s processArgumentString "-usejavacp"
  val g = new Global(s)
  val r = new g.Run
}

That works for your scenario. Even easier:

java -Dscala.usejavacp=true -jar ./scall.jar

Bonus info: I happened to come across the enabling commit message:

Went ahead and implemented classpaths as described in email to scala-internals on the theory that at this point I must know what I'm doing.

** PUBLIC SERVICE ANNOUNCEMENT **

If your code of whatever kind stopped working with this commit (most likely the error is something like "object scala not found") you can get it working again...

Categories : Scala

scala Duration: "This class is not meant as a general purpose representation of time, it is optimized for the needs of scala.concurrent."
Time can be represented in various ways depending on your needs. I personally have used:

Long: a lot of tools take it directly.

Updated: java.time.* (thanks to @Vladimir Matveev). The package is designed by the author of Joda Time (Stephen Colebourne), who says it is designed better.

Joda Time.

java.util.Date.

A separate hierarchy of classes:

trait Time
case class ExactTime(timeMs: Long) extends Time
case object Now extends Time
case object ASAP extends Time
case class RelativeTime(origin: Time, deltaMs: Long) extends Time

An ordered time representation:

case class History[T](events: List[T])

Model time. Once I had a global object Timer with a var currentTime: Long:

object Timer {
  private var currentTimeValue: Long = 0L
  def currentTimeMs = currentTimeValue
  def currentTimeMs_=(newTime: Long) { currentTimeValue = newTime }
}
...

Categories : Scala

Why can I start the Scala compiler with "java -cp scala-library.jar;. Hello World"?
Short version: you're not using the java compiler, you're using the java runtime. Long version: there's a big difference between javac and java. javac is the java compiler, which takes in java source code and outputs jvm bytecode. java is the java runtime, which takes in jvm bytecode and runs it. But one of the great things about the jvm is that you can generate bytecode for it any which way. Scala generates jvm bytecode without any java source code.

Categories : Java

Why use template engine in playframework 2 (scala) if we may stay with pure scala
Actually you should ask this question to the dev team; however, consider a few points. You don't need to use Play's templating engine at all: you can easily return any string with the Ok() method, so according to your link you can just do something like Ok(theDate("John Doe").toString()). Play uses an approach which is typical of other MVC web frameworks, where views are HTML-based files, because... it's a web-dedicated framework. I can't see anything wrong with this; sometimes I work with other languages/frameworks and can see that the only difference in views between them is language-specific syntax. That's the goal! Don't forget, also, that Play is a bilingual system; someone could ask "why not use some Java lib for processing the views?" The built-in Scala XML literals a...

Categories : Scala

scala for each loop got error when convert java to scala
override def saveOrUpdateAll(entities: Collection[T]) {
  import scala.collection.JavaConverters._
  val session: Session = getSession()
  for (entity <- entities.asScala) {
    session.saveOrUpdate(entity)
  }
}

There is no "for each" loop in Scala. You should wrap your collection using JavaConverters and use a for-comprehension here. JavaConverters wraps the Collection using Wrappers.JCollectionWrapper, without memory overhead.

Categories : Java

Scala Macros: Making a Map out of fields of a class in Scala
Note that this can be done much more elegantly without the toString / c.parse business:

import scala.language.experimental.macros

abstract class Model {
  def toMap[T]: Map[String, Any] = macro Macros.toMap_impl[T]
}

object Macros {
  import scala.reflect.macros.Context

  def toMap_impl[T: c.WeakTypeTag](c: Context) = {
    import c.universe._

    val mapApply = Select(reify(Map).tree, newTermName("apply"))

    val pairs = weakTypeOf[T].declarations.collect {
      case m: MethodSymbol if m.isCaseAccessor =>
        val name = c.literal(m.name.decoded)
        val value = c.Expr(Select(c.resetAllAttrs(c.prefix.tree), m.name))
        reify(name.splice -> value.splice).tree
    }

    c.Expr[Map[String, Any]](Apply(mapApply, pairs.toList))
  }
}

Note also that you need the c.resetAllAttrs...

Categories : Scala

Scala import statement at top and inside scala class
The difference is: in Option 1 the import is visible for the whole file, i.e. any class/trait/function in com.somePackage can be used anywhere inside or outside MyClass (within that file). In Option 2 it can only be used inside MyClass, not outside it, because the scope of the import is limited to the body of MyClass.
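A minimal sketch of the two options (my addition), using standard-library imports in place of com.somePackage:

// Option 1: file-level import, usable by everything in this file
import scala.collection.mutable.ListBuffer

class MyClass {
  val buf = new ListBuffer[Int]
}
class OtherClass {
  val buf = new ListBuffer[Int]   // also fine, same file-level import
}

// Option 2: import scoped to the class body
class MyClass2 {
  import scala.collection.mutable.ArrayBuffer
  val buf = new ArrayBuffer[Int]
}
class OtherClass2 {
  // ArrayBuffer is not visible here without its own import
}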

Categories : Scala

Scala: Use multiple constructors from Java in Scala
Now, with the help of a workmate, I have solved the problem: instead of classOf[Button] I have to use classOf[Button].asInstanceOf[Class[_]]. With this it works fine.

Categories : Java


