This project lets you develop messaging microservices with Spring Integration and run them locally, in the cloud, or even on Spring XD. Just add @EnableBinding and run your app as a Spring Boot app (single application context). You just need to connect to the physical broker for the bindings, which is automatic if the relevant binder implementation is available on the classpath. The sample uses Redis.
Here’s a sample source module (output channel only):
@SpringBootApplication
@ComponentScan(basePackageClasses = TimerSource.class)
public class ModuleApplication {

    public static void main(String[] args) {
        SpringApplication.run(ModuleApplication.class, args);
    }

}
@Configuration
@EnableBinding(Source.class)
public class TimerSource {

    @Value("${format}")
    private String format;

    @Bean
    @InboundChannelAdapter(value = Source.OUTPUT, poller = @Poller(fixedDelay = "${fixedDelay}", maxMessagesPerPoll = "1"))
    public MessageSource<String> timerMessageSource() {
        return () -> new GenericMessage<>(new SimpleDateFormat(format).format(new Date()));
    }

}
@EnableBinding is parameterized by an interface (in this case Source) which declares input and output channels. Source, Sink and Processor are provided off the shelf, but you can define others. Here’s the definition of Source:
public interface Source {

    @Output("output")
    MessageChannel output();

}
The @Output annotation is used to identify output channels (messages leaving the module) and @Input is used to identify input channels (messages entering the module). Both are optionally parameterized by a channel name; if the name is not provided, the method name is used instead. An implementation of the interface is created for you and can be used in the application context by autowiring it, e.g. into a test case:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = ModuleApplication.class)
@WebAppConfiguration
@DirtiesContext
public class ModuleApplicationTests {

    @Autowired
    private Source source;

    @Test
    public void contextLoads() {
        assertNotNull(this.source.output());
    }

}
Note
|
In this case there is only one Source in the application context, so there is no need to qualify it when it is autowired. If there is ambiguity, e.g. if you are composing one module from some others, you can use the @Bindings qualifier to inject a specific channel set. The @Bindings qualifier takes a parameter which is the class that carries the @EnableBinding annotation (in this case TimerSource).
|
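For illustration only, a qualified injection along those lines might look like the following sketch; it is not taken from the samples, and it assumes the @Bindings annotation accepts the @EnableBinding-annotated class as its value, as the note above describes:

// Hypothetical: disambiguate by the configuration class that carries @EnableBinding
@Autowired
@Bindings(TimerSource.class)
private Source timerSource;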
A module can have multiple input or output channels, all defined either as @Input and @Output methods in an interface (preferable) or as bean definitions. Instead of just one channel named "input" or "output" you can add multiple MessageChannel methods annotated @Input or @Output, and the names are converted to external channel names on the broker. The external channel names can be specified as properties that consist of the channel names prefixed with spring.cloud.stream.bindings (e.g. spring.cloud.stream.bindings.input or spring.cloud.stream.bindings.output). External channel names can have a channel type as a colon-separated prefix, and the semantics of the external bus channel change accordingly. For example, you can have two MessageChannels called "output" and "foo" in a module with spring.cloud.stream.bindings.output=bar and spring.cloud.stream.bindings.foo=topic:foo, and the result is 2 external channels called "bar" and "topic:foo".
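To make that last example concrete, a hypothetical two-output interface might look like the sketch below; the interface name is invented for this illustration, and the properties in the comment are the ones quoted above:

// Hypothetical interface declaring two output channels. With
// spring.cloud.stream.bindings.output=bar and spring.cloud.stream.bindings.foo=topic:foo
// the external channels on the broker become "bar" and "topic:foo".
public interface TwoOutputs {

    @Output("output")
    MessageChannel output();

    @Output("foo")
    MessageChannel foo();

}

A configuration class would then enable it with @EnableBinding(TwoOutputs.class), just as TimerSource does with Source.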
There are several samples, all running on the Redis transport (so you need Redis running locally to test them).
Note
|
The main set of samples is "vanilla" in the sense that the samples are not deployable as XD modules by the current generation (1.x) of XD. You can still interact with an XD system using the appropriate naming convention for input and output channel names (<stream>.<index> format).
|
- source is a Java config version of the classic "timer" module from Spring XD. It has a "fixedDelay" option (in milliseconds) for the period between emitting messages.
- sink is a Java config version of the classic "log" module from Spring XD. It has no options (but some could easily be added), and just logs incoming messages at INFO level (a minimal sketch of such a sink follows this list).
- transform is a simple pass-through logging transformer (it just logs the incoming message and passes it on).
- double is a combination of 2 modules defined locally (a source and a sink, so the whole app is self-contained).
- extended is a multi-module mashup of source | transform | transform | sink, where the modules are defined in the other samples and referred to in this app just as dependencies.
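For a flavour of what the log sink boils down to, here is a minimal sketch rather than the actual sample code; it assumes the off-the-shelf Sink interface exposes an INPUT channel name constant (mirroring Source.OUTPUT above) and uses a plain Spring Integration @ServiceActivator to consume messages:

// Hypothetical minimal "log" sink: binds the Sink interface and logs each
// incoming payload at INFO level.
@Configuration
@EnableBinding(Sink.class)
public class LogSink {

    private static final Logger logger = LoggerFactory.getLogger(LogSink.class);

    @ServiceActivator(inputChannel = Sink.INPUT)
    public void log(Object payload) {
        logger.info("Received: {}", payload);
    }

}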
If you run the source and the sink and point them at the same redis instance (e.g. do nothing to get the one on localhost, or the one they are both bound to as a service on Cloud Foundry) then they will form a "stream" and start talking to each other. All the samples have friendly JMX and Actuator endpoints for inspecting what is going on in the system.
Code using this library can be deployed as a standalone app or as an XD module. In standalone mode your app will run happily as a service or in any PaaS (Cloud Foundry, Lattice, Heroku, Azure, etc.). There are some things you might do differently depending on whether your main aim is to develop an XD module and you just want to test it locally in standalone mode, or whether the ultimate goal is a standalone app.
The [input,output]ChannelName values are used to create physical endpoints in the external broker (e.g. queue.<channelName> in Redis).
For an XD module the channel names are <group>.<index>: a source (output only) has index=0 (the default), downstream modules have the same group with an incremented index, and a sink module (input only) has the highest index. To listen to the output of a running XD module, just use the same "group" name and an index 1 larger than that of the app before it in the chain.
Note: since the same naming conventions are used in XD, you can steal messages from, or send messages to, an existing XD stream by copying the stream name (to spring.cloud.streams.group) and knowing the index of the XD module you want to interact with.
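As a purely hypothetical illustration, a standalone app tapping an existing XD stream named ticktock might set the group like this (the property name is the one from the note above; the stream name is invented, and the input channel would then bind to an external name of the form <group>.<index>, e.g. ticktock.1):

# Hypothetical application.properties entry for listening to an XD stream called "ticktock"
spring.cloud.streams.group=ticktock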
To build the source you will need to install JDK 1.8.
Spring Cloud uses Maven for most build-related activities, and you should be able to get off the ground quite quickly by cloning the project you are interested in and typing
$ ./mvnw install
Note
|
You can also install Maven (>=3.3.3) yourself and run the mvn command
in place of ./mvnw in the examples below. If you do that you also
might need to add -P spring if your local Maven settings do not
contain repository declarations for spring pre-release artifacts.
|
Note
|
Be aware that you might need to increase the amount of memory
available to Maven by setting a MAVEN_OPTS environment variable with
a value like -Xmx512m -XX:MaxPermSize=128m . We try to cover this in
the .mvn configuration, so if you find you have to do it to make a
build succeed, please raise a ticket to get the settings added to
source control.
|
For hints on how to build the project look in .travis.yml
if there
is one. There should be a "script" and maybe "install" command. Also
look at the "services" section to see if any services need to be
running locally (e.g. mongo or rabbit). Ignore the git-related bits
that you might find in "before_install" since they’re related to setting git
credentials and you already have those.
The projects that require middleware generally include a docker-compose.yml, so consider using Docker Compose to run the middleware servers in Docker containers. See the README in the scripts demo repository for specific instructions about the common cases of mongo, rabbit and redis.
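Assuming such a docker-compose.yml is present in the project you are building, the usual way to bring the services up in the background would be:

$ docker-compose up -d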
Note
|
Migration to the Maven wrapper (./mvnw) is underway. If you find a project that doesn’t have it yet, raise an issue to get it added, and build with the command from .travis.yml (usually mvn install -s .settings.xml).
|
The spring-cloud-build module has a "docs" profile, and if you switch that on it will try to build asciidoc sources from src/main/asciidoc. As part of that process it will look for a README.adoc and process it by loading all the includes, but not parsing or rendering it, just copying it to ${main.basedir} (defaults to ${basedir}, i.e. the root of the project). If there are any changes in the README it will then show up after a Maven build as a modified file in the correct place. Just commit it and push the change.
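For instance, assuming the profile is activated in the usual way with -P (like the spring profile mentioned earlier), building the docs might look like:

$ ./mvnw clean install -P docs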
If you don’t have an IDE preference we would recommend that you use Spring Tool Suite or Eclipse when working with the code. We use the m2eclipse Eclipse plugin for Maven support. Other IDEs and tools should also work without issue.
We recommend the m2eclipse Eclipse plugin when working with Eclipse. If you don’t already have m2eclipse installed it is available from the "Eclipse Marketplace".
Unfortunately m2e does not yet support Maven 3.3, so once the projects are imported into Eclipse you will also need to tell m2eclipse to use the .settings.xml file for the projects. If you do not do this you may see many different errors related to the POMs in the projects. Open your Eclipse preferences, expand the Maven preferences, and select User Settings. In the User Settings field click Browse and navigate to the Spring Cloud project you imported, selecting the .settings.xml file in that project. Click Apply and then OK to save the preference changes.
Note
|
Alternatively you can copy the repository settings from .settings.xml into your own ~/.m2/settings.xml.
|
If you prefer not to use m2eclipse you can generate Eclipse project metadata using the following command:
$ ./mvnw eclipse:eclipse
The generated Eclipse projects can be imported by selecting import existing projects from the file menu.
Spring Cloud uses Project Lombok to generate getters and setters etc. Compiling from the command line this shouldn’t cause any problems, but in an IDE you need to add an agent to the JVM. Full instructions can be found on the Lombok website. The sign that you need to do this is a lot of compiler errors to do with missing methods and fields, e.g.:
The method getInitialStatus() is undefined for the type EurekaInstanceConfigBean EurekaDiscoveryClientConfiguration.java /spring-cloud-netflix-core/src/main/java/org/springframework/cloud/netflix/eureka line 120 Java Problem
The method getInitialStatus() is undefined for the type EurekaInstanceConfigBean EurekaDiscoveryClientConfiguration.java /spring-cloud-netflix-core/src/main/java/org/springframework/cloud/netflix/eureka line 121 Java Problem
The method setNonSecurePort(int) is undefined for the type EurekaInstanceConfigBean EurekaDiscoveryClientConfiguration.java /spring-cloud-netflix-core/src/main/java/org/springframework/cloud/netflix/eureka line 112 Java Problem
The type EurekaInstanceConfigBean.IdentifyingDataCenterInfo must implement the inherited abstract method DataCenterInfo.getName() EurekaInstanceConfigBean.java /spring-cloud-netflix-core/src/main/java/org/springframework/cloud/netflix/eureka line 131 Java Problem
The method getId() is undefined for the type ProxyRouteLocator.ProxyRouteSpec PreDecorationFilter.java /spring-cloud-netflix-core/src/main/java/org/springframework/cloud/netflix/zuul/filters/pre line 60 Java Problem
The method getLocation() is undefined for the type ProxyRouteLocator.ProxyRouteSpec PreDecorationFilter.java /spring-cloud-netflix-core/src/main/java/org/springframework/cloud/netflix/zuul/filters/pre line 55 Java Problem
Spring Cloud is released under the non-restrictive Apache 2.0 license, and follows a very standard GitHub development process, using the GitHub tracker for issues and merging pull requests into master. If you want to contribute even something trivial please do not hesitate, but follow the guidelines below.
Before we accept a non-trivial patch or pull request we will need you to sign the contributor’s agreement. Signing the contributor’s agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.
None of these is essential for a pull request, but they will all help. They can also be added after the original pull request but before a merge.
- Use the Spring Framework code format conventions. If you use Eclipse you can import formatter settings using the eclipse-code-formatter.xml file from the Spring Cloud Build project. If using IntelliJ, you can use the Eclipse Code Formatter Plugin to import the same file.
- Make sure all new .java files have a simple Javadoc class comment with at least an @author tag identifying you, and preferably at least a paragraph on what the class is for.
- Add the ASF license header comment to all new .java files (copy from existing files in the project).
- Add yourself as an @author to the .java files that you modify substantially (more than cosmetic changes).
- Add some Javadocs and, if you change the namespace, some XSD doc elements.
- A few unit tests would help a lot as well — someone has to do it.
- If no-one else is using your branch, please rebase it against the current master (or other target branch in the main project).
- When writing a commit message please follow these conventions; if you are fixing an existing issue, please add Fixes gh-XXXX at the end of the commit message (where XXXX is the issue number).