
Commit 7d9e93a

MINOR: Use https instead of http in links (apache#6477)
Verified that the https links work. I didn't update the license header in this PR since that touches so many files. Will file a separate one for that.

Reviewers: Manikumar Reddy <[email protected]>
1 parent 172fbb2 commit 7d9e93a

11 files changed, 31 insertions(+), 31 deletions(-)
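The commit message above notes that the https links were verified to work. For illustration only (not something included in this commit), below is a minimal Python sketch of how a handful of the switched links could be spot-checked; the URL sample and the `reachable` helper are assumptions made for the example.

    # Illustrative link check, not part of the commit: request each https URL
    # and report any that do not answer with a successful status.
    import urllib.error
    import urllib.request

    URLS = [  # a small sample of the links touched by this commit
        "https://kafka.apache.org/contributing.html",
        "https://www.apache.org/licenses/LICENSE-2.0.txt",
        "https://docs.oracle.com/javase/8/docs/api/",
        "https://www.vagrantup.com/",
    ]

    def reachable(url, timeout=10):
        """Return True if the URL responds with an HTTP status below 400."""
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request, timeout=timeout) as response:
                return response.status < 400
        except (urllib.error.URLError, OSError) as exc:
            print("FAILED %s (%s)" % (url, exc))
            return False

    if __name__ == "__main__":
        broken = [url for url in URLS if not reachable(url)]
        print("%d of %d links unreachable" % (len(broken), len(URLS)))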

CONTRIBUTING.md (+1, -1)

@@ -1,6 +1,6 @@
 ## Contributing to Kafka

-*Before opening a pull request*, review the [Contributing](http://kafka.apache.org/contributing.html) and [Contributing Code Changes](https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes) pages.
+*Before opening a pull request*, review the [Contributing](https://kafka.apache.org/contributing.html) and [Contributing Code Changes](https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes) pages.

 It lists steps that are required before creating a PR.

NOTICE (+1, -1)

@@ -2,7 +2,7 @@ Apache Kafka
 Copyright 2019 The Apache Software Foundation.

 This product includes software developed at
-The Apache Software Foundation (http://www.apache.org/).
+The Apache Software Foundation (https://www.apache.org/).

 This distribution has a binary dependency on jersey, which is available under the CDDL
 License. The source code of jersey can be found at https://github.com/jersey/jersey/.

README.md (+4, -4)

@@ -1,8 +1,8 @@
 Apache Kafka
 =================
-See our [web site](http://kafka.apache.org) for details on the project.
+See our [web site](https://kafka.apache.org) for details on the project.

-You need to have [Gradle](http://www.gradle.org/installation) and [Java](http://www.oracle.com/technetwork/java/javase/downloads/index.html) installed.
+You need to have [Gradle](https://www.gradle.org/installation) and [Java](https://www.oracle.com/technetwork/java/javase/downloads/index.html) installed.

 Kafka requires Gradle 5.0 or higher.

@@ -19,7 +19,7 @@ Now everything else will work.
 ### Build a jar and run it ###
     ./gradlew jar

-Follow instructions in http://kafka.apache.org/documentation.html#quickstart
+Follow instructions in https://kafka.apache.org/documentation.html#quickstart

 ### Build source jar ###
     ./gradlew srcJar
@@ -209,4 +209,4 @@ See [vagrant/README.md](vagrant/README.md).
 Apache Kafka is interested in building the community; we would welcome any thoughts or [patches](https://issues.apache.org/jira/browse/KAFKA). You can reach us [on the Apache mailing lists](http://kafka.apache.org/contact.html).

 To contribute follow the instructions here:
-* http://kafka.apache.org/contributing.html
+* https://kafka.apache.org/contributing.html

build.gradle (+4, -4)

@@ -189,11 +189,11 @@ subprojects {
       pom.project {
         name 'Apache Kafka'
         packaging 'jar'
-        url 'http://kafka.apache.org'
+        url 'https://kafka.apache.org'
         licenses {
           license {
             name 'The Apache Software License, Version 2.0'
-            url 'http://www.apache.org/licenses/LICENSE-2.0.txt'
+            url 'https://www.apache.org/licenses/LICENSE-2.0.txt'
             distribution 'repo'
           }
         }
@@ -1420,7 +1420,7 @@ project(':connect:api') {

   javadoc {
     include "**/org/apache/kafka/connect/**" // needed for the `javadocAll` task
-    options.links "http://docs.oracle.com/javase/7/docs/api/"
+    options.links "https://docs.oracle.com/javase/8/docs/api/"
   }

   tasks.create(name: "copyDependantLibs", type: Copy) {
@@ -1699,5 +1699,5 @@ task aggregatedJavadoc(type: Javadoc) {
   classpath = files(projectsWithJavadoc.collect { it.sourceSets.main.compileClasspath })
   includes = projectsWithJavadoc.collectMany { it.javadoc.getIncludes() }
   excludes = projectsWithJavadoc.collectMany { it.javadoc.getExcludes() }
-  options.links "http://docs.oracle.com/javase/7/docs/api/"
+  options.links "https://docs.oracle.com/javase/8/docs/api/"
 }

doap_Kafka.rdf (+10, -10)

@@ -2,9 +2,9 @@
 <?xml-stylesheet type="text/xsl"?>
 <rdf:RDF xml:lang="en"
   xmlns="http://usefulinc.com/ns/doap#"
-  xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
-  xmlns:asfext="http://projects.apache.org/ns/asfext#"
-  xmlns:foaf="http://xmlns.com/foaf/0.1/">
+  xmlns:rdf="https://www.w3.org/1999/02/22-rdf-syntax-ns#"
+  xmlns:asfext="https://projects.apache.org/ns/asfext#"
+  xmlns:foaf="https://xmlns.com/foaf/0.1/">
 <!--
     Licensed to the Apache Software Foundation (ASF) under one or more
     contributor license agreements. See the NOTICE file distributed with
@@ -21,22 +21,22 @@
     See the License for the specific language governing permissions and
     limitations under the License.
 -->
-<Project rdf:about="http://kafka.apache.org/">
+<Project rdf:about="https://kafka.apache.org/">
     <created>2014-04-12</created>
     <license rdf:resource="http://usefulinc.com/doap/licenses/asl20" />
     <name>Apache Kafka</name>
-    <homepage rdf:resource="http://kafka.apache.org/" />
-    <asfext:pmc rdf:resource="http://kafka.apache.org" />
+    <homepage rdf:resource="https://kafka.apache.org/" />
+    <asfext:pmc rdf:resource="https://kafka.apache.org" />
    <shortdesc>Apache Kafka is a distributed, fault tolerant, publish-subscribe messaging.</shortdesc>
    <description>A single Kafka broker can handle hundreds of megabytes of reads and writes per second from thousands of clients. Kafka is designed to allow a single cluster to serve as the central data backbone for a large organization. It can be elastically and transparently expanded without downtime. Data streams are partitioned and spread over a cluster of machines to allow data streams larger than the capability of any single machine and to allow clusters of co-ordinated consumers. Kafka has a modern cluster-centric design that offers strong durability and fault-tolerance guarantees. Messages are persisted on disk and replicated within the cluster to prevent data loss. Each broker can handle terabytes of messages without performance impact.</description>
    <bug-database rdf:resource="https://issues.apache.org/jira/browse/KAFKA" />
-    <mailing-list rdf:resource="http://kafka.apache.org/contact.html" />
-    <download-page rdf:resource="http://kafka.apache.org/downloads.html" />
+    <mailing-list rdf:resource="https://kafka.apache.org/contact.html" />
+    <download-page rdf:resource="https://kafka.apache.org/downloads.html" />
    <programming-language>Scala</programming-language>
-    <category rdf:resource="http://projects.apache.org/category/big-data" />
+    <category rdf:resource="https://projects.apache.org/projects.html?category#big-data" />
    <repository>
      <SVNRepository>
-        <location rdf:resource="http://git-wip-us.apache.org/repos/asf/kafka.git"/>
+        <location rdf:resource="https://gitbox.apache.org/repos/asf/kafka.git"/>
        <browse rdf:resource="https://github.com/apache/kafka"/>
      </SVNRepository>
    </repository>

gradle/buildscript.gradle (+1, -1)

@@ -17,7 +17,7 @@ repositories {
   repositories {
     // For license plugin.
     maven {
-      url 'http://dl.bintray.com/content/netflixoss/external-gradle-plugins/'
+      url 'https://dl.bintray.com/content/netflixoss/external-gradle-plugins/'
     }
   }
 }

jmh-benchmarks/README.md (+2, -2)

@@ -1,11 +1,11 @@
 ###JMH-Benchmark module

-This module contains benchmarks written using [JMH](http://openjdk.java.net/projects/code-tools/jmh/) from OpenJDK.
+This module contains benchmarks written using [JMH](https://openjdk.java.net/projects/code-tools/jmh/) from OpenJDK.
 Writing correct micro-benchmarks is Java (or another JVM language) is difficult and there are many non-obvious pitfalls (many
 due to compiler optimizations). JMH is a framework for running and analyzing benchmarks (micro or macro) written in Java (or
 another JVM language).

-For help in writing correct JMH tests, the best place to start is the [sample code](http://hg.openjdk.java.net/code-tools/jmh/file/tip/jmh-samples/src/main/java/org/openjdk/jmh/samples/) provided
+For help in writing correct JMH tests, the best place to start is the [sample code](https://hg.openjdk.java.net/code-tools/jmh/file/tip/jmh-samples/src/main/java/org/openjdk/jmh/samples/) provided
 by the JMH project.

 Typically, JMH is expected to run as a separate project in Maven. The jmh-benchmarks module uses

release_notes.py (+3, -3)

@@ -98,16 +98,16 @@ def issue_type_key(issue):

 print "<h1>Release Notes - Kafka - Version %s</h1>" % version
 print """<p>Below is a summary of the JIRA issues addressed in the %(version)s release of Kafka. For full documentation of the
-release, a guide to get started, and information about the project, see the <a href="http://kafka.apache.org/">Kafka
+release, a guide to get started, and information about the project, see the <a href="https://kafka.apache.org/">Kafka
 project site</a>.</p>

 <p><b>Note about upgrades:</b> Please carefully review the
-<a href="http://kafka.apache.org/%(minor)s/documentation.html#upgrade">upgrade documentation</a> for this release thoroughly
+<a href="https://kafka.apache.org/%(minor)s/documentation.html#upgrade">upgrade documentation</a> for this release thoroughly
 before upgrading your cluster. The upgrade notes discuss any critical information about incompatibilities and breaking
 changes, performance changes, and any other changes that might impact your production deployment of Kafka.</p>

 <p>The documentation for the most recent release can be found at
-<a href="http://kafka.apache.org/documentation.html">http://kafka.apache.org/documentation.html</a>.</p>""" % { 'version': version, 'minor': minor_version_dotless }
+<a href="https://kafka.apache.org/documentation.html">https://kafka.apache.org/documentation.html</a>.</p>""" % { 'version': version, 'minor': minor_version_dotless }
 for itype, issues in by_group:
     print "<h2>%s</h2>" % itype
     print "<ul>"

tests/README.md (+3, -3)

@@ -357,7 +357,7 @@ For a tutorial on how to setup and run the Kafka system tests, see
 https://cwiki.apache.org/confluence/display/KAFKA/tutorial+-+set+up+and+run+Kafka+system+tests+with+ducktape

 * Install Virtual Box from [https://www.virtualbox.org/](https://www.virtualbox.org/) (run `$ vboxmanage --version` to check if it's installed).
-* Install Vagrant >= 1.6.4 from [http://www.vagrantup.com/](http://www.vagrantup.com/) (run `vagrant --version` to check if it's installed).
+* Install Vagrant >= 1.6.4 from [https://www.vagrantup.com/](https://www.vagrantup.com/) (run `vagrant --version` to check if it's installed).
 * Install system test dependencies, including ducktape, a command-line tool and library for testing distributed systems. We recommend to use virtual env for system test development

     $ cd kafka/tests
@@ -401,12 +401,12 @@ Preparation
 In these steps, we will create an IAM role which has permission to create and destroy EC2 instances,
 set up a keypair used for ssh access to the test driver and worker machines, and create a security group to allow the test driver and workers to all communicate via TCP.

-* [Create an IAM role](http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html). We'll give this role the ability to launch or kill additional EC2 machines.
+* [Create an IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html). We'll give this role the ability to launch or kill additional EC2 machines.
     - Create role "kafkatest-master"
     - Role type: Amazon EC2
     - Attach policy: AmazonEC2FullAccess (this will allow our test-driver to create and destroy EC2 instances)

-* If you haven't already, [set up a keypair to use for SSH access](http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html). For the purpose
+* If you haven't already, [set up a keypair to use for SSH access](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html). For the purpose
 of this quickstart, let's say the keypair name is kafkatest, and you've saved the private key in kafktest.pem

 * Next, create a EC2 security group called "kafkatest".

tests/bootstrap-test-env.sh (+1, -1)

@@ -36,7 +36,7 @@ echo "Checking Vagrant installation..."
 vagrant_version=`vagrant --version | egrep -o "[0-9]+\.[0-9]+\.[0-9]+"`
 bad_vagrant=false
 if [ "$(version $vagrant_version)" -lt "$(version 1.6.4)" ]; then
-    echo "Found Vagrant version $vagrant_version. Please upgrade to 1.6.4 or higher (see http://www.vagrantup.com for details)"
+    echo "Found Vagrant version $vagrant_version. Please upgrade to 1.6.4 or higher (see https://www.vagrantup.com for details)"
     bad_vagrant=true
 else
     echo "Vagrant installation looks good."

vagrant/README.md (+1, -1)

@@ -3,7 +3,7 @@
 Using Vagrant to get up and running.

 1) Install Virtual Box [https://www.virtualbox.org/](https://www.virtualbox.org/)
-2) Install Vagrant >= 1.6.4 [http://www.vagrantup.com/](http://www.vagrantup.com/)
+2) Install Vagrant >= 1.6.4 [https://www.vagrantup.com/](https://www.vagrantup.com/)
 3) Install Vagrant Plugins:

     $ vagrant plugin install vagrant-hostmanager
